Apr 23 16:32:30.929084 ip-10-0-129-102 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 16:32:30.929095 ip-10-0-129-102 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 16:32:30.929102 ip-10-0-129-102 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 16:32:30.929360 ip-10-0-129-102 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 16:32:41.170301 ip-10-0-129-102 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 16:32:41.170316 ip-10-0-129-102 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ab6cb03a43c3473a9a11a9a555c672e5 --
Apr 23 16:35:07.857753 ip-10-0-129-102 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:08.320024 ip-10-0-129-102 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:08.320024 ip-10-0-129-102 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:08.320024 ip-10-0-129-102 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:08.320024 ip-10-0-129-102 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:35:08.320024 ip-10-0-129-102 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
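The result 'resources' on the earlier boot means systemd refused to start the unit because a prerequisite could not be found (first the EnvironmentFile referenced by kubelet.service, then the crio.service unit the restart job depends on), not that the kubelet itself crashed. A minimal triage sketch, assuming a root shell on the node; the environment-file path below is a placeholder, substitute whatever paths systemctl cat actually prints:

    # Show the rendered unit, including its EnvironmentFile= and ExecStartPre= lines
    systemctl cat kubelet.service

    # Check whether the referenced environment file exists (placeholder path)
    ls -l /etc/kubernetes/kubelet-env 2>/dev/null || echo "environment file missing"

    # Confirm whether the container runtime unit is installed at all
    systemctl list-unit-files crio.service

    # Review the kubelet's history from the previous (failed) boot
    journalctl -b -1 -u kubelet.service --no-pager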
Apr 23 16:35:08.321638 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.321548 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:35:08.325971 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325949 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:08.325971 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325967 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:08.325971 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325971 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:08.325971 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325975 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:08.325971 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325978 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325982 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325985 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325988 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325990 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325993 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.325996 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326000 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326004 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326008 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326010 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326013 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326017 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326022 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326025 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326028 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326031 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326034 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326036 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:08.326151 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326039 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326042 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326045 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326047 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326050 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326053 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326055 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326058 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326060 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326063 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326066 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326068 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326070 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326073 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326076 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326079 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326082 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326085 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326088 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326091 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:08.326627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326094 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326096 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326099 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326101 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326103 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326106 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326110 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326113 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326116 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326119 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326121 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326124 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326126 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326129 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326131 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326134 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326136 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326139 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326142 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326145 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:08.327130 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326147 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326150 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326153 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326156 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326158 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326161 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326164 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326167 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326170 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326172 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326175 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326177 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326180 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326183 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326185 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326188 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326190 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326193 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326196 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326198 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:08.327627 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326201 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:08.328104 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326203 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:08.328104 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.326206 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
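These gate names appear to come from the cluster-level (OpenShift) FeatureGate configuration rather than from the kubelet's own gate table, which is why upstream feature_gate.go logs each one as unrecognized; on the node the kubelet re-parses its gate configuration more than once per startup, so each name is warned about repeatedly in the journal. A sketch for collapsing that noise into a counted, unique list, assuming journalctl access:

    # Count each distinct unrecognized gate once, most frequent first
    journalctl -b -u kubelet.service --no-pager \
      | grep 'unrecognized feature gate' \
      | awk '{print $NF}' | sort | uniq -c | sort -rn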
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327660 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327667 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327674 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327679 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327684 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327687 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327692 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327697 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327700 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327703 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327707 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:08.330049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327710 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327713 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327717 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327720 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327723 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327726 2578 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327729 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327732 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327736 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327739 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327742 2578 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327745 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327749 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327753 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327757 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327760 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327763 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327766 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327769 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327772 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327776 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327785 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327789 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327792 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327795 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:08.330587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327798 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327802 2578 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327805 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327810 2578 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327813 2578 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327816 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327819 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327823 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327827 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327830 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327833 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327837 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327840 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327843 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327846 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327849 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327852 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327855 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327858 2578 flags.go:64] FLAG: --feature-gates=""
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327862 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327865 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327868 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327872 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327875 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327878 2578 flags.go:64] FLAG: --help="false"
Apr 23 16:35:08.331224 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327881 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-129-102.ec2.internal"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327884 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327887 2578 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327891 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327895 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327898 2578 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327901 2578 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327904 2578 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327907 2578 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327910 2578 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327913 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327916 2578 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327919 2578 flags.go:64] FLAG: --kube-reserved=""
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327922 2578 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327926 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327929 2578 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327932 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327935 2578 flags.go:64] FLAG: --lock-file=""
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327938 2578 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327941 2578 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327944 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327949 2578 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327952 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327955 2578 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 16:35:08.331841 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327958 2578 flags.go:64] FLAG: --logging-format="text"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327961 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327964 2578 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327967 2578 flags.go:64] FLAG: --manifest-url=""
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327969 2578 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327974 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327978 2578 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327982 2578 flags.go:64] FLAG: --max-pods="110"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327985 2578 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327988 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327991 2578 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327995 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.327998 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328001 2578 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328004 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328013 2578 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328016 2578 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328020 2578 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328023 2578 flags.go:64] FLAG: --pod-cidr=""
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328026 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328032 2578 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328035 2578 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328041 2578 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328044 2578 flags.go:64] FLAG: --port="10250"
Apr 23 16:35:08.332444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328047 2578 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328050 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c31e9465b1bc7994"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328053 2578 flags.go:64] FLAG: --qos-reserved=""
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328056 2578 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328059 2578 flags.go:64] FLAG: --register-node="true"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328062 2578 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328065 2578 flags.go:64] FLAG: --register-with-taints=""
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328068 2578 flags.go:64] FLAG: --registry-burst="10"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328071 2578 flags.go:64] FLAG: --registry-qps="5"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328074 2578 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328077 2578 flags.go:64] FLAG: --reserved-memory=""
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328080 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328083 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328086 2578 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328089 2578 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328093 2578 flags.go:64] FLAG: --runonce="false"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328096 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328099 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328102 2578 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328107 2578 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328110 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328113 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328116 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328120 2578 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328122 2578 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328125 2578 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 16:35:08.333040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328128 2578 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328131 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328134 2578 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328137 2578 flags.go:64] FLAG: --system-cgroups=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328140 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328147 2578 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328151 2578 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328155 2578 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328161 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328164 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328167 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328170 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328173 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328176 2578 flags.go:64] FLAG: --v="2"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328180 2578 flags.go:64] FLAG: --version="false"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328184 2578 flags.go:64] FLAG: --vmodule=""
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328189 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 16:35:08.333680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.328192 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
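The deprecation warnings at startup refer to flags visible in this dump: --container-runtime-endpoint, --volume-plugin-dir, and --system-reserved are expected to move into the file named by --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent KubeletConfiguration stanzas, with values copied from the dump above; it writes to an example path because on this platform the live kubelet.conf is typically rendered by the machine-config operator and should not be hand-edited:

    # Illustrative only: written to /tmp, not to the managed config file
    cat <<'EOF' > /tmp/kubelet.conf.example
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
    systemReserved:
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    EOF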
GatewayAPIController Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328495 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328498 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328501 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328504 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328506 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328509 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328511 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328514 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328516 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328520 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328523 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328526 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328528 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328531 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328534 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328536 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:08.335961 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.328539 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:08.336448 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.329359 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:08.336709 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.336688 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 16:35:08.336738 
Apr 23 16:35:08.336738 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.336711 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:35:08.336771 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336763 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:08.336771 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336771 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336774 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336777 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336780 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336783 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336786 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336788 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336791 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336794 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336796 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336799 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336802 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336805 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336808 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336811 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336815 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336820 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336822 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336825 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:08.336829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336828 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336832 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336835 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336838 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336842 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336845 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336848 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336851 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336854 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336857 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336859 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336862 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336865 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336868 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336870 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336873 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336876 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336878 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336881 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336883 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:08.337292 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336886 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336888 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336890 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336893 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336896 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336898 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336901 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336903 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336906 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336910 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336913 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336916 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336918 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336921 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336924 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336927 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336930 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336932 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336935 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336938 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:08.337810 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336940 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336943 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336946 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336948 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336951 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336953 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336956 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336959 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336961 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336964 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336966 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336969 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336971 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336973 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336976 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336978 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336981 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336983 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336986 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336988 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:08.338321 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336991 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336993 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336996 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.336999 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337001 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337004 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.337011 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337130 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337135 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337138 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337141 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337144 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337147 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337150 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337153 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:08.338836 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337156 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337158 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337532 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337536 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337539 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337542 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337545 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337548 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337551 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337553 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337557 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337560 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337562 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337565 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337567 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337570 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337572 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337575 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337577 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337580 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:08.339204 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337583 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337585 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337588 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337591 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337593 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337596 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337598 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337601 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337603 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337606 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337608 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337611 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337613 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337616 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337618 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337621 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337623 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337626 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337629 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337631 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:08.339699 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337634 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337636 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337639 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337641 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337643 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337646 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337649 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337651 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337653 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337656 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337658 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337661 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337664 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337666 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337669 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337672 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337676 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337680 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337683 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:08.340190 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337686 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337689 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337691 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337694 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337696 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337699 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337701 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337704 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337706 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337709 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337712 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337714 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337717 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337719 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337722 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337724 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337728 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337731 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:08.340678 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:08.337734 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:08.341114 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.337739 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:08.341114 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.338471 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:35:08.341114 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.340489 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:35:08.341521 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.341509 2578 server.go:1019] "Starting client certificate rotation"
Apr 23 16:35:08.341633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.341614 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:08.341670 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.341660 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:35:08.369947 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.369922 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:08.373913 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.373889 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:35:08.393756 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.393733 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:35:08.399801 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.399786 2578 log.go:25] "Validated CRI v1 image API"
Apr 23 16:35:08.401465 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.401440 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:35:08.405774 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.405754 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8a5c1c77-17f3-417b-ad8a-7cdff312f599:/dev/nvme0n1p3 a5e64cf4-f403-4762-b8a0-0f6662608a52:/dev/nvme0n1p4]
Apr 23 16:35:08.405838 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.405774 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:35:08.408531 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.408515 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:35:08.411305 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.411194 2578 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:08.409436138 +0000 UTC m=+0.431832244 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100652 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2005c9f402ea63a81633287297318e SystemUUID:ec2005c9-f402-ea63-a816-33287297318e BootID:ab6cb03a-43c3-473a-9a11-a9a555c672e5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:42:20:48:a7:f5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:42:20:48:a7:f5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:b8:34:8f:af:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:35:08.411305 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.411300 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:35:08.411440 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.411428 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:35:08.412825 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.412690 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:35:08.412956 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.412829 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-102.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 16:35:08.412999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.412966 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 16:35:08.412999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.412975 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 16:35:08.412999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.412988 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:08.413646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.413635 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:08.415144 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.415133 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:08.415256 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.415247 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 16:35:08.417840 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.417830 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 16:35:08.417874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.417849 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 16:35:08.417874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.417871 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 16:35:08.417940 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.417882 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 23 16:35:08.417940 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.417891 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 16:35:08.419119 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.419102 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:08.419198 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.419127 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:08.424575 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.424558 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 16:35:08.426515 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.426499 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 16:35:08.428296 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428279 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 16:35:08.428296 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428299 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428306 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428312 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428318 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428323 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428329 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428335 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428342 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428348 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428357 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 16:35:08.428424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.428365 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 16:35:08.429906 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.429887 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5cmbf"
Apr 23 16:35:08.429980 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.429970 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 16:35:08.430017 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.429981 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 16:35:08.431897 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.431872 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 16:35:08.431969 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.431879 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-102.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 16:35:08.432004 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.431992 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-102.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 16:35:08.433728 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.433716 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 16:35:08.433772 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.433751 2578 server.go:1295] "Started kubelet"
Apr 23 16:35:08.433874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.433834 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 16:35:08.433924 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.433847 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 16:35:08.433924 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.433915 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 16:35:08.434663 ip-10-0-129-102 systemd[1]: Started Kubernetes Kubelet.
Apr 23 16:35:08.435195 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.435017 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 16:35:08.435641 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.435627 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 16:35:08.438521 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.438505 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5cmbf"
Apr 23 16:35:08.442408 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.442392 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 16:35:08.442408 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.442395 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:08.443022 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.443007 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:35:08.443022 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.443013 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:35:08.443159 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.443032 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:35:08.443159 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.442888 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:35:08.443258 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.443202 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:35:08.443258 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.443210 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:35:08.443425 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.443374 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 23 16:35:08.443999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.443980 2578 factory.go:153] Registering CRI-O factory
Apr 23 16:35:08.444101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444052 2578 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:35:08.444160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444101 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:35:08.444160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444109 2578 factory.go:55] Registering systemd factory
Apr 23 16:35:08.444160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444116 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:35:08.444160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444135 2578 factory.go:103] Registering Raw factory
Apr 23 16:35:08.444341 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444165 2578 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:35:08.444795 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.444772 2578 manager.go:319] Starting recovery of all containers
Apr 23 16:35:08.448985 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.448964 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:08.452620 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.452594 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-102.ec2.internal\" not found" node="ip-10-0-129-102.ec2.internal"
Apr 23 16:35:08.461188 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.461172 2578 manager.go:324] Recovery completed
Apr 23 16:35:08.465042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.465030 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:08.467569 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.467551 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:08.467660 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.467580 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:08.467660 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.467595 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:08.468087 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.468068 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:35:08.468087 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.468084 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:35:08.468195 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.468106 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:08.470602 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.470589 2578 policy_none.go:49] "None policy: Start"
Apr 23 16:35:08.470602 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.470605 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:35:08.470681 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.470615 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:35:08.512008 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.511989 2578 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.512073 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.512088 2578 server.go:85] "Starting device plugin registration server"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.512322 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.512347 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.512439 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.512515 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.512528 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.514444 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:35:08.516469 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.514490 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 23 16:35:08.569647 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.569605 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:35:08.570934 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.570878 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:35:08.570934 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.570906 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:35:08.570934 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.570924 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 16:35:08.570934 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.570931 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:35:08.571136 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.570966 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:35:08.573629 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.573610 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:08.613046 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.613017 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:08.614056 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.614038 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:08.614152 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.614067 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:08.614152 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.614077 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:08.614152 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.614098 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-102.ec2.internal"
Apr 23 16:35:08.622831 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.622811 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-102.ec2.internal"
Apr 23 16:35:08.622876 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.622841 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-102.ec2.internal\": node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 23 16:35:08.632935 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.632912 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found"
Apr 23 16:35:08.671742 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.671710 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal"]
Apr 23 16:35:08.671809 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.671781 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:08.673912 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.673893 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:08.674021 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.673925 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:08.674021 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.673940 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:08.675100 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675087 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:08.675235 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675220 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal"
Apr 23 16:35:08.675271 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675247 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:08.675789 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675773 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:08.675874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675803 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:08.675874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675817 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:08.675874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675864 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:08.676017 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675887 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:08.676017 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.675901 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:08.677113 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.677098 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.677180 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.677123 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:35:08.677753 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.677735 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:35:08.677829 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.677768 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:35:08.677829 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.677783 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:35:08.691898 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.691875 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-102.ec2.internal\" not found" node="ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.695517 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.695502 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-102.ec2.internal\" not found" node="ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.733816 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.733798 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:08.745474 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.745453 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/432a54920ff69b032f406403f8e82323-config\") pod \"kube-apiserver-proxy-ip-10-0-129-102.ec2.internal\" (UID: \"432a54920ff69b032f406403f8e82323\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.745545 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.745484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.745545 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.745506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.834241 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.834165 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:08.846093 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.846072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/432a54920ff69b032f406403f8e82323-config\") pod \"kube-apiserver-proxy-ip-10-0-129-102.ec2.internal\" (UID: \"432a54920ff69b032f406403f8e82323\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.846167 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.846102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.846167 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.846121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.846167 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.846147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.846258 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.846162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d00e776913cd1177ab03d04d7041f574-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal\" (UID: \"d00e776913cd1177ab03d04d7041f574\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.846258 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.846189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/432a54920ff69b032f406403f8e82323-config\") pod \"kube-apiserver-proxy-ip-10-0-129-102.ec2.internal\" (UID: \"432a54920ff69b032f406403f8e82323\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.934445 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:08.934414 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:08.993929 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.993895 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" Apr 23 16:35:08.998584 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:08.998568 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:09.035234 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.035201 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.135720 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.135656 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.236211 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.236189 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.336696 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.336668 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.341868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.341850 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 16:35:09.342009 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.341992 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 16:35:09.342063 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.342015 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 16:35:09.437755 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.437722 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.440942 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.440904 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:08 +0000 UTC" deadline="2028-01-31 22:42:55.67293505 +0000 UTC" Apr 23 16:35:09.440942 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.440939 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15558h7m46.231999068s" Apr 23 16:35:09.443049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.443031 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 16:35:09.454919 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.454895 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:09.478353 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.478326 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9psv4" Apr 23 16:35:09.487176 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.487159 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9psv4" Apr 23 16:35:09.494822 ip-10-0-129-102 
kubenswrapper[2578]: W0423 16:35:09.494795 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432a54920ff69b032f406403f8e82323.slice/crio-335480ddf00d2ab8841ff03ce8c146fc8becd85c9155d29934211673e430ad9b WatchSource:0}: Error finding container 335480ddf00d2ab8841ff03ce8c146fc8becd85c9155d29934211673e430ad9b: Status 404 returned error can't find the container with id 335480ddf00d2ab8841ff03ce8c146fc8becd85c9155d29934211673e430ad9b Apr 23 16:35:09.495155 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:09.495138 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00e776913cd1177ab03d04d7041f574.slice/crio-a7a329e9c54697c1e82bbfaf904f01502eb6daf6be558c67c82754de26a3561f WatchSource:0}: Error finding container a7a329e9c54697c1e82bbfaf904f01502eb6daf6be558c67c82754de26a3561f: Status 404 returned error can't find the container with id a7a329e9c54697c1e82bbfaf904f01502eb6daf6be558c67c82754de26a3561f Apr 23 16:35:09.499711 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.499698 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:35:09.538106 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.538076 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.574155 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.574111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" event={"ID":"432a54920ff69b032f406403f8e82323","Type":"ContainerStarted","Data":"335480ddf00d2ab8841ff03ce8c146fc8becd85c9155d29934211673e430ad9b"} Apr 23 16:35:09.574999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.574972 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" event={"ID":"d00e776913cd1177ab03d04d7041f574","Type":"ContainerStarted","Data":"a7a329e9c54697c1e82bbfaf904f01502eb6daf6be558c67c82754de26a3561f"} Apr 23 16:35:09.639156 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.639128 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.739689 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:09.739609 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-102.ec2.internal\" not found" Apr 23 16:35:09.829187 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.829155 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:09.843193 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.843165 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" Apr 23 16:35:09.852947 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.852919 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:35:09.855023 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.854998 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" Apr 23 16:35:09.867506 
ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.867479 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:35:09.897357 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:09.897206 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:10.419480 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.419446 2578 apiserver.go:52] "Watching apiserver" Apr 23 16:35:10.425789 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.425762 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:35:10.427725 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.427695 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-q2hgm","openshift-network-diagnostics/network-check-target-5lhlh","openshift-ovn-kubernetes/ovnkube-node-hbtmc","kube-system/konnectivity-agent-s9d8v","kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz","openshift-image-registry/node-ca-xsjmw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal","openshift-multus/network-metrics-daemon-h6kzn","openshift-network-operator/iptables-alerter-2dvdd","openshift-cluster-node-tuning-operator/tuned-hgwkb","openshift-dns/node-resolver-frq2q","openshift-multus/multus-additional-cni-plugins-fmfdm"] Apr 23 16:35:10.430550 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.430530 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.430632 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.430619 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:10.430815 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.430689 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:10.431827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.431806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.432872 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.432849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.433950 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.433929 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.435049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.435032 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.436275 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.436259 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:10.436365 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.436326 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:10.437457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.437443 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.438769 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.438689 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.438871 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.438784 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:35:10.439935 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.439919 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.440454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.440437 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.440540 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.440437 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.440926 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.440891 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:35:10.440926 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.440902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.441267 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.441267 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441214 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:35:10.441404 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441353 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:35:10.441465 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441369 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vggrm\"" Apr 23 16:35:10.441465 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441364 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.441897 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441737 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.441897 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441797 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.442041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.441927 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.442094 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442075 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.442227 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442190 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-75hxj\"" Apr 23 16:35:10.442309 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442253 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 16:35:10.442365 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442315 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.442489 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:35:10.442629 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442597 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:35:10.442725 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.442602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.443257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.443237 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:35:10.443541 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.443518 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-zd6rx\"" Apr 23 16:35:10.443829 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.443801 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 16:35:10.444429 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.444035 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.444429 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.444321 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:35:10.444639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.444622 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rndf6\"" Apr 23 16:35:10.444897 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.444883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-b995b\"" Apr 23 16:35:10.445117 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.445102 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.445245 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.445226 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:10.445344 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.445328 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dwlr4\"" Apr 23 16:35:10.445508 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.445493 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:35:10.445600 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.445566 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:35:10.445921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.445757 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-hlwxr\"" Apr 23 16:35:10.446621 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.446469 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-cjg42\"" Apr 23 16:35:10.447600 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.447413 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:35:10.447600 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.447560 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 
16:35:10.448310 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.448290 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:35:10.448652 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.448636 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nlksj\"" Apr 23 16:35:10.454355 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.454456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-etc-selinux\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.454456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-socket-dir-parent\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.454456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-k8s-cni-cncf-io\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.454456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-cni-multus\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-systemd\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-node-log\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454541 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-cnibin\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-os-release\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295g7\" (UniqueName: \"kubernetes.io/projected/b7f21f2f-2763-41c8-af5e-52de8001226b-kube-api-access-295g7\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-lib-modules\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovn-node-metrics-cert\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.454636 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454630 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454700 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-ovn\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10e49ebb-e9c6-4f87-903f-bb7018d79002-host-slash\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454761 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysctl-d\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-multus-certs\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b84q\" (UniqueName: \"kubernetes.io/projected/d285eb72-a566-4dcd-badf-2fefeec9c577-kube-api-access-8b84q\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454869 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-sys-fs\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-kubernetes\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysctl-conf\") pod \"tuned-hgwkb\" (UID: 
\"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-run\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-cni-netd\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.454986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-system-cni-dir\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cf2af80-3ff4-4717-af9c-87bb29677708-host\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.455101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455056 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-device-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpvtf\" (UniqueName: \"kubernetes.io/projected/1be7f2cd-30d6-400e-8502-6227dcb98324-kube-api-access-kpvtf\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455152 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27hw\" (UniqueName: \"kubernetes.io/projected/10e49ebb-e9c6-4f87-903f-bb7018d79002-kube-api-access-c27hw\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455180 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-log-socket\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455827 
ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-run-ovn-kubernetes\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cni-binary-copy\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-cni-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455278 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-daemon-config\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-modprobe-d\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-sys\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-tuned\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-var-lib-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wpztf\" (UniqueName: \"kubernetes.io/projected/3301fde8-0566-4365-a9d8-b069eb4bebb7-kube-api-access-wpztf\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455451 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-os-release\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455481 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6675\" (UniqueName: \"kubernetes.io/projected/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-kube-api-access-z6675\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.455827 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/10e49ebb-e9c6-4f87-903f-bb7018d79002-iptables-alerter-script\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-host\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-slash\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455650 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-run-netns\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-registration-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74494667-d025-4d57-be34-03a72ee7cbaa-hosts-file\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74494667-d025-4d57-be34-03a72ee7cbaa-tmp-dir\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xnz4\" (UniqueName: \"kubernetes.io/projected/74494667-d025-4d57-be34-03a72ee7cbaa-kube-api-access-9xnz4\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-socket-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455813 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-netns\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-cni-bin\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovnkube-config\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cnibin\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-system-cni-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455947 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d285eb72-a566-4dcd-badf-2fefeec9c577-cni-binary-copy\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455969 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-conf-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.456409 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.455990 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-cni-bin\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456014 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6-konnectivity-ca\") pod \"konnectivity-agent-s9d8v\" (UID: \"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6\") " pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456047 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-etc-kubernetes\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldz9\" (UniqueName: \"kubernetes.io/projected/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-kube-api-access-7ldz9\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-kubelet\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456191 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-env-overrides\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456283 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6-agent-certs\") pod \"konnectivity-agent-s9d8v\" (UID: \"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6\") " pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-tmp\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456332 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-systemd-units\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovnkube-script-lib\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-kubelet\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysconfig\") pod \"tuned-hgwkb\" (UID: 
\"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-var-lib-kubelet\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-etc-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-hostroot\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456475 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-systemd\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.457042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cf2af80-3ff4-4717-af9c-87bb29677708-serviceca\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.457693 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.456514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmcsm\" (UniqueName: \"kubernetes.io/projected/0cf2af80-3ff4-4717-af9c-87bb29677708-kube-api-access-mmcsm\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.488487 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.488458 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:09 +0000 UTC" deadline="2028-01-10 00:56:59.359000018 +0000 UTC" Apr 23 16:35:10.488487 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.488484 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15032h21m48.870519506s" Apr 23 16:35:10.557323 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-sys-fs\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.557512 ip-10-0-129-102 
kubenswrapper[2578]: I0423 16:35:10.557335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-kubernetes\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557369 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysctl-conf\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-sys-fs\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-run\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-cni-netd\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-run\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557473 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-kubernetes\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.557512 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-system-cni-dir\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557527 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-cni-netd\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 
16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-system-cni-dir\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557558 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysctl-conf\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cf2af80-3ff4-4717-af9c-87bb29677708-host\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cf2af80-3ff4-4717-af9c-87bb29677708-host\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-device-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-device-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpvtf\" (UniqueName: \"kubernetes.io/projected/1be7f2cd-30d6-400e-8502-6227dcb98324-kube-api-access-kpvtf\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c27hw\" (UniqueName: \"kubernetes.io/projected/10e49ebb-e9c6-4f87-903f-bb7018d79002-kube-api-access-c27hw\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.557849 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557800 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-log-socket\") pod \"ovnkube-node-hbtmc\" (UID: 
\"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.557709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-log-socket\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-run-ovn-kubernetes\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cni-binary-copy\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-cni-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-run-ovn-kubernetes\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-cni-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-daemon-config\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-modprobe-d\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-sys\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-tuned\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-var-lib-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpztf\" (UniqueName: \"kubernetes.io/projected/3301fde8-0566-4365-a9d8-b069eb4bebb7-kube-api-access-wpztf\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-os-release\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.558974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6675\" (UniqueName: \"kubernetes.io/projected/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-kube-api-access-z6675\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/10e49ebb-e9c6-4f87-903f-bb7018d79002-iptables-alerter-script\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.559263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-var-lib-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559321 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-modprobe-d\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cni-binary-copy\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559673 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-os-release\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-sys\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-host\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-slash\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-run-netns\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" 
Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-registration-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-daemon-config\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.559983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-slash\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74494667-d025-4d57-be34-03a72ee7cbaa-hosts-file\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74494667-d025-4d57-be34-03a72ee7cbaa-tmp-dir\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-run-netns\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-host\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/74494667-d025-4d57-be34-03a72ee7cbaa-hosts-file\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.560921 
ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560196 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/10e49ebb-e9c6-4f87-903f-bb7018d79002-iptables-alerter-script\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xnz4\" (UniqueName: \"kubernetes.io/projected/74494667-d025-4d57-be34-03a72ee7cbaa-kube-api-access-9xnz4\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-registration-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560261 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-socket-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560334 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-netns\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-cni-bin\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovnkube-config\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-netns\") pod \"multus-q2hgm\" (UID: 
\"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74494667-d025-4d57-be34-03a72ee7cbaa-tmp-dir\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cnibin\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-socket-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-cni-bin\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-system-cni-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d285eb72-a566-4dcd-badf-2fefeec9c577-cni-binary-copy\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-system-cni-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.560921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-conf-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-conf-dir\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " 
pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-cni-bin\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6-konnectivity-ca\") pod \"konnectivity-agent-s9d8v\" (UID: \"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6\") " pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-etc-kubernetes\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldz9\" (UniqueName: \"kubernetes.io/projected/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-kube-api-access-7ldz9\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560887 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-cnibin\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-kubelet\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-env-overrides\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6-agent-certs\") pod \"konnectivity-agent-s9d8v\" (UID: \"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6\") " pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-tmp\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-systemd-units\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d285eb72-a566-4dcd-badf-2fefeec9c577-cni-binary-copy\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.560775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-cni-bin\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovnkube-script-lib\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-etc-kubernetes\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-kubelet\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.561744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-kubelet\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-kubelet\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysconfig\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-var-lib-kubelet\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-etc-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-env-overrides\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-hostroot\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovnkube-config\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-systemd\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561448 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-var-lib-kubelet\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysconfig\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-etc-openvswitch\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cf2af80-3ff4-4717-af9c-87bb29677708-serviceca\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmcsm\" (UniqueName: \"kubernetes.io/projected/0cf2af80-3ff4-4717-af9c-87bb29677708-kube-api-access-mmcsm\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561542 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-hostroot\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-etc-selinux\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-socket-dir-parent\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.562639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-k8s-cni-cncf-io\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-cni-multus\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561775 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-k8s-cni-cncf-io\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-var-lib-cni-multus\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-systemd\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-node-log\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-cnibin\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-os-release\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-295g7\" (UniqueName: \"kubernetes.io/projected/b7f21f2f-2763-41c8-af5e-52de8001226b-kube-api-access-295g7\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.561978 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-lib-modules\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovn-node-metrics-cert\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562037 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6-konnectivity-ca\") pod \"konnectivity-agent-s9d8v\" (UID: \"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6\") " pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-systemd-units\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562097 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-ovn\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.563477 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.562185 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562188 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10e49ebb-e9c6-4f87-903f-bb7018d79002-host-slash\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysctl-d\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovnkube-script-lib\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.562309 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.062280586 +0000 UTC m=+3.084676694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-ovn\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-run-systemd\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-multus-certs\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-cnibin\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b84q\" (UniqueName: \"kubernetes.io/projected/d285eb72-a566-4dcd-badf-2fefeec9c577-kube-api-access-8b84q\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10e49ebb-e9c6-4f87-903f-bb7018d79002-host-slash\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562684 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-etc-selinux\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cf2af80-3ff4-4717-af9c-87bb29677708-serviceca\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1be7f2cd-30d6-400e-8502-6227dcb98324-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-os-release\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.562990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-sysctl-d\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.564257 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-systemd\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563112 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-multus-socket-dir-parent\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-lib-modules\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d285eb72-a566-4dcd-badf-2fefeec9c577-host-run-multus-certs\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.563503 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3301fde8-0566-4365-a9d8-b069eb4bebb7-node-log\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.565097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.564581 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6-agent-certs\") pod \"konnectivity-agent-s9d8v\" (UID: \"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6\") " pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.565604 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.565580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-tmp\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.565729 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.565619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-etc-tuned\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.566444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.566422 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/3301fde8-0566-4365-a9d8-b069eb4bebb7-ovn-node-metrics-cert\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.573293 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.573269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27hw\" (UniqueName: \"kubernetes.io/projected/10e49ebb-e9c6-4f87-903f-bb7018d79002-kube-api-access-c27hw\") pod \"iptables-alerter-2dvdd\" (UID: \"10e49ebb-e9c6-4f87-903f-bb7018d79002\") " pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.574206 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.574183 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:10.574206 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.574205 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:10.574342 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.574218 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kbbpz for pod openshift-network-diagnostics/network-check-target-5lhlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:10.574342 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:10.574273 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz podName:017fd19b-a66e-4805-8f42-625a4749d380 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:11.074259488 +0000 UTC m=+3.096655599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kbbpz" (UniqueName: "kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz") pod "network-check-target-5lhlh" (UID: "017fd19b-a66e-4805-8f42-625a4749d380") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:10.577131 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.577088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmcsm\" (UniqueName: \"kubernetes.io/projected/0cf2af80-3ff4-4717-af9c-87bb29677708-kube-api-access-mmcsm\") pod \"node-ca-xsjmw\" (UID: \"0cf2af80-3ff4-4717-af9c-87bb29677708\") " pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.577688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.577655 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldz9\" (UniqueName: \"kubernetes.io/projected/a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6-kube-api-access-7ldz9\") pod \"tuned-hgwkb\" (UID: \"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6\") " pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.577793 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.577768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b84q\" (UniqueName: \"kubernetes.io/projected/d285eb72-a566-4dcd-badf-2fefeec9c577-kube-api-access-8b84q\") pod \"multus-q2hgm\" (UID: \"d285eb72-a566-4dcd-badf-2fefeec9c577\") " pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.578399 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.578350 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpztf\" (UniqueName: \"kubernetes.io/projected/3301fde8-0566-4365-a9d8-b069eb4bebb7-kube-api-access-wpztf\") pod \"ovnkube-node-hbtmc\" (UID: \"3301fde8-0566-4365-a9d8-b069eb4bebb7\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.578790 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.578768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xnz4\" (UniqueName: \"kubernetes.io/projected/74494667-d025-4d57-be34-03a72ee7cbaa-kube-api-access-9xnz4\") pod \"node-resolver-frq2q\" (UID: \"74494667-d025-4d57-be34-03a72ee7cbaa\") " pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.579484 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.579464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6675\" (UniqueName: \"kubernetes.io/projected/07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd-kube-api-access-z6675\") pod \"multus-additional-cni-plugins-fmfdm\" (UID: \"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd\") " pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.579676 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.579659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpvtf\" (UniqueName: \"kubernetes.io/projected/1be7f2cd-30d6-400e-8502-6227dcb98324-kube-api-access-kpvtf\") pod \"aws-ebs-csi-driver-node-sx5rz\" (UID: \"1be7f2cd-30d6-400e-8502-6227dcb98324\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.581159 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.581132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-295g7\" (UniqueName: 
\"kubernetes.io/projected/b7f21f2f-2763-41c8-af5e-52de8001226b-kube-api-access-295g7\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:10.742192 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.742103 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xsjmw" Apr 23 16:35:10.751025 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.751004 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:10.755430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.755407 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:10.757946 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.757929 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:10.763536 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.763518 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" Apr 23 16:35:10.770129 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.770109 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q2hgm" Apr 23 16:35:10.777747 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.777732 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2dvdd" Apr 23 16:35:10.784302 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.784286 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" Apr 23 16:35:10.790831 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.790810 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-frq2q" Apr 23 16:35:10.796314 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.796297 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" Apr 23 16:35:10.870232 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:10.870202 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:35:11.054640 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.054606 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07043a05_dfd7_4ffb_ac7d_95bd4f1e3ccd.slice/crio-920a8a64b7dd925a75e3d87ce4da2a73f92520624fa02981a8daf10ee0219395 WatchSource:0}: Error finding container 920a8a64b7dd925a75e3d87ce4da2a73f92520624fa02981a8daf10ee0219395: Status 404 returned error can't find the container with id 920a8a64b7dd925a75e3d87ce4da2a73f92520624fa02981a8daf10ee0219395 Apr 23 16:35:11.056594 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.056570 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda12982ff_ef0c_4f4d_88dc_c3e4719ef6d6.slice/crio-8c18cd5342f97b652a96bf94406142800281baf5bcf60317ba84d35ad4ae1a7d WatchSource:0}: Error finding container 8c18cd5342f97b652a96bf94406142800281baf5bcf60317ba84d35ad4ae1a7d: Status 404 returned error can't find the container with id 8c18cd5342f97b652a96bf94406142800281baf5bcf60317ba84d35ad4ae1a7d Apr 23 16:35:11.058311 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.058285 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1be7f2cd_30d6_400e_8502_6227dcb98324.slice/crio-2de1fcf94e23d1f2c6f2b4acd7785b6389794ad6ea6feda95e66548df15e2bfd WatchSource:0}: Error finding container 2de1fcf94e23d1f2c6f2b4acd7785b6389794ad6ea6feda95e66548df15e2bfd: Status 404 returned error can't find the container with id 2de1fcf94e23d1f2c6f2b4acd7785b6389794ad6ea6feda95e66548df15e2bfd Apr 23 16:35:11.059079 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.059005 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3301fde8_0566_4365_a9d8_b069eb4bebb7.slice/crio-c667bc39731dead79bf3cdde2740722323c82ba101cf7a2c8e12280c2494078b WatchSource:0}: Error finding container c667bc39731dead79bf3cdde2740722323c82ba101cf7a2c8e12280c2494078b: Status 404 returned error can't find the container with id c667bc39731dead79bf3cdde2740722323c82ba101cf7a2c8e12280c2494078b Apr 23 16:35:11.061369 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.061352 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74494667_d025_4d57_be34_03a72ee7cbaa.slice/crio-a0769f77c101334e4b73718d49d958e87452dcded77e297bdd94c633271a058f WatchSource:0}: Error finding container a0769f77c101334e4b73718d49d958e87452dcded77e297bdd94c633271a058f: Status 404 returned error can't find the container with id a0769f77c101334e4b73718d49d958e87452dcded77e297bdd94c633271a058f Apr 23 16:35:11.064200 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.062804 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e49ebb_e9c6_4f87_903f_bb7018d79002.slice/crio-c4311717d311ebfa95a4d295ad1a74174e280d3a243910faacb56dbcaef91f8d WatchSource:0}: Error finding container c4311717d311ebfa95a4d295ad1a74174e280d3a243910faacb56dbcaef91f8d: Status 404 returned error can't find the 
container with id c4311717d311ebfa95a4d295ad1a74174e280d3a243910faacb56dbcaef91f8d Apr 23 16:35:11.064466 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.064265 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd285eb72_a566_4dcd_badf_2fefeec9c577.slice/crio-8dd5c2ebe4cafa96551b57e58ed3bdabe1295b75f76efbd4c7d93720900450b4 WatchSource:0}: Error finding container 8dd5c2ebe4cafa96551b57e58ed3bdabe1295b75f76efbd4c7d93720900450b4: Status 404 returned error can't find the container with id 8dd5c2ebe4cafa96551b57e58ed3bdabe1295b75f76efbd4c7d93720900450b4 Apr 23 16:35:11.065308 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.065286 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac1c0ce2_6f52_471e_ba47_e46a7d7fc0a6.slice/crio-f6fca3f1644f81a8def7de25e623cd4f7277489b79124f50eba793ed97733760 WatchSource:0}: Error finding container f6fca3f1644f81a8def7de25e623cd4f7277489b79124f50eba793ed97733760: Status 404 returned error can't find the container with id f6fca3f1644f81a8def7de25e623cd4f7277489b79124f50eba793ed97733760 Apr 23 16:35:11.065726 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.065703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:11.065839 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.065822 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:11.065891 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.065886 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:12.065871724 +0000 UTC m=+4.088267818 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:11.066170 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:11.066147 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf2af80_3ff4_4717_af9c_87bb29677708.slice/crio-d121d77f19dccea54586206b0237e94658f2ec3444b4cf297cb1b638dad9a79a WatchSource:0}: Error finding container d121d77f19dccea54586206b0237e94658f2ec3444b4cf297cb1b638dad9a79a: Status 404 returned error can't find the container with id d121d77f19dccea54586206b0237e94658f2ec3444b4cf297cb1b638dad9a79a Apr 23 16:35:11.166114 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.166090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:11.166250 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.166232 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:11.166316 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.166255 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:11.166316 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.166265 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kbbpz for pod openshift-network-diagnostics/network-check-target-5lhlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:11.166316 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.166305 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz podName:017fd19b-a66e-4805-8f42-625a4749d380 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:12.166292058 +0000 UTC m=+4.188688152 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kbbpz" (UniqueName: "kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz") pod "network-check-target-5lhlh" (UID: "017fd19b-a66e-4805-8f42-625a4749d380") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:11.488723 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.488610 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:09 +0000 UTC" deadline="2027-12-07 05:15:50.749665054 +0000 UTC" Apr 23 16:35:11.488723 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.488647 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14220h40m39.261021025s" Apr 23 16:35:11.571615 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.571586 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:11.571757 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:11.571700 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:11.580431 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.580350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xsjmw" event={"ID":"0cf2af80-3ff4-4717-af9c-87bb29677708","Type":"ContainerStarted","Data":"d121d77f19dccea54586206b0237e94658f2ec3444b4cf297cb1b638dad9a79a"} Apr 23 16:35:11.584714 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.584663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s9d8v" event={"ID":"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6","Type":"ContainerStarted","Data":"f6fca3f1644f81a8def7de25e623cd4f7277489b79124f50eba793ed97733760"} Apr 23 16:35:11.588960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.588931 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2dvdd" event={"ID":"10e49ebb-e9c6-4f87-903f-bb7018d79002","Type":"ContainerStarted","Data":"c4311717d311ebfa95a4d295ad1a74174e280d3a243910faacb56dbcaef91f8d"} Apr 23 16:35:11.601345 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.601316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" event={"ID":"1be7f2cd-30d6-400e-8502-6227dcb98324","Type":"ContainerStarted","Data":"2de1fcf94e23d1f2c6f2b4acd7785b6389794ad6ea6feda95e66548df15e2bfd"} Apr 23 16:35:11.606115 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.606056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"c667bc39731dead79bf3cdde2740722323c82ba101cf7a2c8e12280c2494078b"} Apr 23 16:35:11.612581 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.612541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" 
event={"ID":"432a54920ff69b032f406403f8e82323","Type":"ContainerStarted","Data":"f132830f5abb93267b3396dc779950d93865f49b2a2bf8d79efe1b9b68850e78"} Apr 23 16:35:11.619297 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.619271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q2hgm" event={"ID":"d285eb72-a566-4dcd-badf-2fefeec9c577","Type":"ContainerStarted","Data":"8dd5c2ebe4cafa96551b57e58ed3bdabe1295b75f76efbd4c7d93720900450b4"} Apr 23 16:35:11.622691 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.622668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-frq2q" event={"ID":"74494667-d025-4d57-be34-03a72ee7cbaa","Type":"ContainerStarted","Data":"a0769f77c101334e4b73718d49d958e87452dcded77e297bdd94c633271a058f"} Apr 23 16:35:11.633770 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.633746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" event={"ID":"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6","Type":"ContainerStarted","Data":"8c18cd5342f97b652a96bf94406142800281baf5bcf60317ba84d35ad4ae1a7d"} Apr 23 16:35:11.638896 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:11.638871 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerStarted","Data":"920a8a64b7dd925a75e3d87ce4da2a73f92520624fa02981a8daf10ee0219395"} Apr 23 16:35:12.076324 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:12.076288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:12.076468 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.076431 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:12.076524 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.076494 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:14.076476369 +0000 UTC m=+6.098872481 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:12.177326 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:12.177228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:12.177631 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.177595 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:12.177631 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.177622 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:12.177631 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.177636 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kbbpz for pod openshift-network-diagnostics/network-check-target-5lhlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:12.177845 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.177694 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz podName:017fd19b-a66e-4805-8f42-625a4749d380 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:14.177676325 +0000 UTC m=+6.200072432 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kbbpz" (UniqueName: "kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz") pod "network-check-target-5lhlh" (UID: "017fd19b-a66e-4805-8f42-625a4749d380") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:12.574093 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:12.573591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:12.574093 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:12.573721 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:12.650542 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:12.650223 2578 generic.go:358] "Generic (PLEG): container finished" podID="d00e776913cd1177ab03d04d7041f574" containerID="2906077b5ccab8b926c5c8b18d411731cc0de5fea31b4f0805d039e6352d7371" exitCode=0 Apr 23 16:35:12.650542 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:12.650448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" event={"ID":"d00e776913cd1177ab03d04d7041f574","Type":"ContainerDied","Data":"2906077b5ccab8b926c5c8b18d411731cc0de5fea31b4f0805d039e6352d7371"} Apr 23 16:35:12.665350 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:12.665292 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-102.ec2.internal" podStartSLOduration=3.665275402 podStartE2EDuration="3.665275402s" podCreationTimestamp="2026-04-23 16:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:11.626453537 +0000 UTC m=+3.648849654" watchObservedRunningTime="2026-04-23 16:35:12.665275402 +0000 UTC m=+4.687671519" Apr 23 16:35:13.571851 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:13.571813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:13.572102 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:13.571939 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:13.655237 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:13.655200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" event={"ID":"d00e776913cd1177ab03d04d7041f574","Type":"ContainerStarted","Data":"8c5c765e30b7fa4c3a458a7deaad7cfe3868a579ea4a22141f4e3ab2a4321c24"} Apr 23 16:35:14.095922 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:14.095816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:14.096101 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.095961 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:14.096101 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.096039 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.096018022 +0000 UTC m=+10.118414119 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:14.196482 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:14.196446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:14.196700 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.196645 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:14.196700 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.196668 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:14.196700 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.196681 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kbbpz for pod openshift-network-diagnostics/network-check-target-5lhlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:14.196905 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.196748 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz podName:017fd19b-a66e-4805-8f42-625a4749d380 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:18.196726982 +0000 UTC m=+10.219123078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kbbpz" (UniqueName: "kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz") pod "network-check-target-5lhlh" (UID: "017fd19b-a66e-4805-8f42-625a4749d380") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:14.571542 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:14.571465 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:14.571707 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:14.571605 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:15.571532 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.571488 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:15.572042 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:15.571625 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:15.601463 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.601403 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-102.ec2.internal" podStartSLOduration=6.601367171 podStartE2EDuration="6.601367171s" podCreationTimestamp="2026-04-23 16:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:13.673177905 +0000 UTC m=+5.695574024" watchObservedRunningTime="2026-04-23 16:35:15.601367171 +0000 UTC m=+7.623763288" Apr 23 16:35:15.601786 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.601766 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lpllm"] Apr 23 16:35:15.604715 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.604684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.604855 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:15.604758 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:15.707099 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.707062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-kubelet-config\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.707282 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.707127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.707282 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.707154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-dbus\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.807634 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.807594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.807819 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.807660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-dbus\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.807819 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.807746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-kubelet-config\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.807929 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:15.807872 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:15.807929 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.807911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-kubelet-config\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.808024 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:15.807931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-dbus\") pod \"global-pull-secret-syncer-lpllm\" (UID: 
\"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:15.808024 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:15.807945 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret podName:3c77861a-3b9a-47ae-9a06-cdc3b74145f7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:16.30792604 +0000 UTC m=+8.330322150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret") pod "global-pull-secret-syncer-lpllm" (UID: "3c77861a-3b9a-47ae-9a06-cdc3b74145f7") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:16.311792 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:16.311750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:16.311979 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:16.311890 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:16.311979 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:16.311965 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret podName:3c77861a-3b9a-47ae-9a06-cdc3b74145f7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:17.311943921 +0000 UTC m=+9.334340036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret") pod "global-pull-secret-syncer-lpllm" (UID: "3c77861a-3b9a-47ae-9a06-cdc3b74145f7") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:16.571514 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:16.571438 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:16.571660 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:16.571579 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:17.321641 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:17.321605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:17.321835 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:17.321805 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:17.321902 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:17.321878 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret podName:3c77861a-3b9a-47ae-9a06-cdc3b74145f7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:19.321856922 +0000 UTC m=+11.344253019 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret") pod "global-pull-secret-syncer-lpllm" (UID: "3c77861a-3b9a-47ae-9a06-cdc3b74145f7") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:17.571987 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:17.571165 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:17.571987 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:17.571298 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:17.571987 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:17.571732 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:17.571987 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:17.571825 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:18.128451 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:18.128416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:18.128623 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.128554 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:18.128623 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.128614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.128599961 +0000 UTC m=+18.150996056 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:18.229896 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:18.229855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:18.230075 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.230044 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:18.230075 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.230071 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:18.230217 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.230086 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kbbpz for pod openshift-network-diagnostics/network-check-target-5lhlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:18.230217 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.230147 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz podName:017fd19b-a66e-4805-8f42-625a4749d380 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:26.230127939 +0000 UTC m=+18.252524038 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kbbpz" (UniqueName: "kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz") pod "network-check-target-5lhlh" (UID: "017fd19b-a66e-4805-8f42-625a4749d380") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:18.573340 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:18.573249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:18.573759 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:18.573412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:19.338678 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:19.338635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:19.338860 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:19.338805 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:19.338928 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:19.338897 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret podName:3c77861a-3b9a-47ae-9a06-cdc3b74145f7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.338876489 +0000 UTC m=+15.361272598 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret") pod "global-pull-secret-syncer-lpllm" (UID: "3c77861a-3b9a-47ae-9a06-cdc3b74145f7") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:19.571532 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:19.571468 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:19.571705 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:19.571468 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:19.571705 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:19.571601 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:19.571824 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:19.571702 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:20.571235 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:20.571119 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:20.571667 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:20.571255 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:21.571502 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:21.571467 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:21.571502 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:21.571509 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:21.571967 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:21.571602 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:21.571967 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:21.571688 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:22.574619 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:22.574588 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:22.575033 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:22.574727 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:23.365243 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:23.365183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:23.365424 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:23.365365 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:23.365484 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:23.365453 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret podName:3c77861a-3b9a-47ae-9a06-cdc3b74145f7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.365436407 +0000 UTC m=+23.387832501 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret") pod "global-pull-secret-syncer-lpllm" (UID: "3c77861a-3b9a-47ae-9a06-cdc3b74145f7") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:23.572047 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:23.572011 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:23.572047 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:23.572044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:23.572278 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:23.572142 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:23.572328 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:23.572266 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:24.573985 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:24.573957 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:24.574458 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:24.574098 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:25.571392 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:25.571343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:25.571575 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:25.571343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:25.571575 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:25.571474 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:25.571661 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:25.571575 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:26.186888 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:26.186852 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:26.187281 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.186974 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:26.187281 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.187041 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.187016196 +0000 UTC m=+34.209412307 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:26.287625 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:26.287587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:26.287803 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.287734 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:26.287803 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.287753 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:26.287803 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.287765 2578 projected.go:194] Error preparing data for projected volume kube-api-access-kbbpz for pod openshift-network-diagnostics/network-check-target-5lhlh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:26.287955 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.287822 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz podName:017fd19b-a66e-4805-8f42-625a4749d380 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.287804699 +0000 UTC m=+34.310200795 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kbbpz" (UniqueName: "kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz") pod "network-check-target-5lhlh" (UID: "017fd19b-a66e-4805-8f42-625a4749d380") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:26.571583 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:26.571500 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:26.571787 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:26.571643 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:27.571693 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:27.571658 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:27.572054 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:27.571659 2578 util.go:30] "No sandbox for pod can be found. 
Apr 23 16:35:27.572054 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:27.571757 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7"
Apr 23 16:35:27.572054 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:27.571826 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380"
Apr 23 16:35:28.573166 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.572969 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn"
Apr 23 16:35:28.573741 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:28.573270 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b"
Apr 23 16:35:28.681683 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.681623 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log"
Apr 23 16:35:28.682006 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.681974 2578 generic.go:358] "Generic (PLEG): container finished" podID="3301fde8-0566-4365-a9d8-b069eb4bebb7" containerID="c76e56c8ce40fe38ff195afaed823b06ea73b6dda7a8d4d9a3c730bdb709f416" exitCode=1
Apr 23 16:35:28.682074 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.682043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"dd6864001215e06cb9488a7d9b627013e65c2b65e1cd1dc8ba6a5f30cf6f4866"}
Apr 23 16:35:28.682147 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.682081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"fa14615310181df565f9a6d1865a3445f8d6a98b750c8e875886365e5ede4083"}
Apr 23 16:35:28.682147 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.682095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerDied","Data":"c76e56c8ce40fe38ff195afaed823b06ea73b6dda7a8d4d9a3c730bdb709f416"}
Apr 23 16:35:28.682147 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.682110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"11b701f80a134d0d5d1bd8fbf630d61ff483dccb3dacccc25369a79f0a7ef743"}
Apr 23 16:35:28.683425 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.683370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q2hgm" event={"ID":"d285eb72-a566-4dcd-badf-2fefeec9c577","Type":"ContainerStarted","Data":"fe0ba452cca018971c18b121fd2a9155df166c4a5351abfb504092205944a4cc"}
Apr 23 16:35:28.684686 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.684664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-frq2q" event={"ID":"74494667-d025-4d57-be34-03a72ee7cbaa","Type":"ContainerStarted","Data":"f85f9115869dfd256ff37c9b19e19ed616320f4779546ba43bcc1496ab8d6e3d"}
Apr 23 16:35:28.685907 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.685887 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" event={"ID":"a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6","Type":"ContainerStarted","Data":"855c29a08b6c30a3675ece5cf37309702b8edf6fe383803147bce426bffe722e"}
Apr 23 16:35:28.687497 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.687470 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerStarted","Data":"58e7d41fd7a4ad1868d691aaec8ff5e9205cfb3e568cce61244caca6148b8d4d"}
Apr 23 16:35:28.691813 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.689142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xsjmw" event={"ID":"0cf2af80-3ff4-4717-af9c-87bb29677708","Type":"ContainerStarted","Data":"8516ad596cfbb1fa1960011134718857e2058cced088841cda3172e83ea4dc33"}
Apr 23 16:35:28.693353 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.693328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s9d8v" event={"ID":"ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6","Type":"ContainerStarted","Data":"e6a56875d57b9ca42247a970d826ba1a48f84c4f2adfdff7a859db294dbaf747"}
Apr 23 16:35:28.694874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.694854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" event={"ID":"1be7f2cd-30d6-400e-8502-6227dcb98324","Type":"ContainerStarted","Data":"547f276556c7b06151a2230b4324120ff22997be278067c9b5660095a896ee31"}
Apr 23 16:35:28.716243 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.716201 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q2hgm" podStartSLOduration=3.658290623 podStartE2EDuration="20.71618663s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.06716047 +0000 UTC m=+3.089556578" lastFinishedPulling="2026-04-23 16:35:28.125056476 +0000 UTC m=+20.147452585" observedRunningTime="2026-04-23 16:35:28.715887445 +0000 UTC m=+20.738283598" watchObservedRunningTime="2026-04-23 16:35:28.71618663 +0000 UTC m=+20.738582758"
Apr 23 16:35:28.777117 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.776827 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hgwkb" podStartSLOduration=3.730496121 podStartE2EDuration="20.776807564s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.058364405 +0000 UTC m=+3.080760501" lastFinishedPulling="2026-04-23 16:35:28.10467585 +0000 UTC m=+20.127071944" observedRunningTime="2026-04-23 16:35:28.737087744 +0000 UTC m=+20.759483860" watchObservedRunningTime="2026-04-23 16:35:28.776807564 +0000 UTC m=+20.799203681"
m=+3.080760501" lastFinishedPulling="2026-04-23 16:35:28.10467585 +0000 UTC m=+20.127071944" observedRunningTime="2026-04-23 16:35:28.737087744 +0000 UTC m=+20.759483860" watchObservedRunningTime="2026-04-23 16:35:28.776807564 +0000 UTC m=+20.799203681" Apr 23 16:35:28.800416 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.800319 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-frq2q" podStartSLOduration=3.781087635 podStartE2EDuration="20.800302788s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.06545888 +0000 UTC m=+3.087854989" lastFinishedPulling="2026-04-23 16:35:28.084674036 +0000 UTC m=+20.107070142" observedRunningTime="2026-04-23 16:35:28.799922606 +0000 UTC m=+20.822318726" watchObservedRunningTime="2026-04-23 16:35:28.800302788 +0000 UTC m=+20.822698905" Apr 23 16:35:28.834141 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:28.834099 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xsjmw" podStartSLOduration=11.944205671 podStartE2EDuration="20.834082295s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.069345706 +0000 UTC m=+3.091741804" lastFinishedPulling="2026-04-23 16:35:19.959222328 +0000 UTC m=+11.981618428" observedRunningTime="2026-04-23 16:35:28.833756117 +0000 UTC m=+20.856152233" watchObservedRunningTime="2026-04-23 16:35:28.834082295 +0000 UTC m=+20.856478410" Apr 23 16:35:29.571788 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.571762 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:29.571897 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.571788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:29.571897 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:29.571865 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:29.572014 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:29.571968 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:29.576807 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.576780 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:29.697865 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.697775 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2dvdd" event={"ID":"10e49ebb-e9c6-4f87-903f-bb7018d79002","Type":"ContainerStarted","Data":"3ab781616f099e41b341804f99c918e560b35bc4e7c738477c1af01df3e155cd"} Apr 23 16:35:29.699375 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.699350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" event={"ID":"1be7f2cd-30d6-400e-8502-6227dcb98324","Type":"ContainerStarted","Data":"7d3e5e538b0e88893329e2e3a85044317c9419e61df1d8b9b129e068f597304c"} Apr 23 16:35:29.701704 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.701686 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:35:29.702053 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.702031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"680a21f88b52ed15e70eeb4753dbc5ff997625353f62a24197c165a5f88f470a"} Apr 23 16:35:29.702143 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.702060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"4b9d030f8667173f7058f099ed98806a78833e9da1e0ee0374690961130bc969"} Apr 23 16:35:29.703287 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.703266 2578 generic.go:358] "Generic (PLEG): container finished" podID="07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd" containerID="58e7d41fd7a4ad1868d691aaec8ff5e9205cfb3e568cce61244caca6148b8d4d" exitCode=0 Apr 23 16:35:29.703395 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.703361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerDied","Data":"58e7d41fd7a4ad1868d691aaec8ff5e9205cfb3e568cce61244caca6148b8d4d"} Apr 23 16:35:29.716577 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.716528 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s9d8v" podStartSLOduration=4.680319913 podStartE2EDuration="21.716514749s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.067683338 +0000 UTC m=+3.090079436" lastFinishedPulling="2026-04-23 16:35:28.103878177 +0000 UTC m=+20.126274272" observedRunningTime="2026-04-23 16:35:28.852948687 +0000 UTC m=+20.875344804" watchObservedRunningTime="2026-04-23 16:35:29.716514749 +0000 UTC m=+21.738910864" Apr 23 16:35:29.716835 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:29.716805 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2dvdd" podStartSLOduration=4.681002195 podStartE2EDuration="21.716797698s" podCreationTimestamp="2026-04-23 16:35:08 
+0000 UTC" firstStartedPulling="2026-04-23 16:35:11.067413844 +0000 UTC m=+3.089809937" lastFinishedPulling="2026-04-23 16:35:28.103209326 +0000 UTC m=+20.125605440" observedRunningTime="2026-04-23 16:35:29.716644884 +0000 UTC m=+21.739040995" watchObservedRunningTime="2026-04-23 16:35:29.716797698 +0000 UTC m=+21.739193815" Apr 23 16:35:30.524956 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:30.524848 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:29.576793862Z","UUID":"e056d724-eef2-45c3-a78c-d45cd349aba1","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:30.527005 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:30.526979 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:30.527133 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:30.527014 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:30.571828 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:30.571803 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:30.571976 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:30.571926 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:31.428088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.427910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:31.428638 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:31.428070 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:31.428638 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:31.428219 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret podName:3c77861a-3b9a-47ae-9a06-cdc3b74145f7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:47.428197555 +0000 UTC m=+39.450593665 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret") pod "global-pull-secret-syncer-lpllm" (UID: "3c77861a-3b9a-47ae-9a06-cdc3b74145f7") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:31.571412 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.571368 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:31.571582 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.571368 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:31.571582 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:31.571504 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:31.571582 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:31.571573 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:31.708487 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.708457 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" event={"ID":"1be7f2cd-30d6-400e-8502-6227dcb98324","Type":"ContainerStarted","Data":"7aaa9567d12af6e15010839375976a167a0909c15573a12e4684f9416ca8e240"} Apr 23 16:35:31.711314 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.711295 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:35:31.711651 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.711628 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"9ddaa5f10d778dab2c987a7a3d43d8386b8ac0902142f35402a6b5a5670e9cf2"} Apr 23 16:35:31.732441 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:31.732371 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sx5rz" podStartSLOduration=3.541222425 podStartE2EDuration="23.732353593s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.060609955 +0000 UTC m=+3.083006050" lastFinishedPulling="2026-04-23 16:35:31.25174112 +0000 UTC m=+23.274137218" observedRunningTime="2026-04-23 16:35:31.731828315 +0000 UTC m=+23.754224442" watchObservedRunningTime="2026-04-23 16:35:31.732353593 +0000 UTC m=+23.754749713" Apr 23 16:35:32.571563 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:32.571526 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:32.572142 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:32.571656 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:33.037244 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:33.037128 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:33.038056 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:33.038036 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:33.571258 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:33.571219 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:33.571459 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:33.571267 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:33.571459 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:33.571348 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:33.571577 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:33.571472 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:33.655485 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:33.655448 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:33.656168 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:33.656147 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s9d8v" Apr 23 16:35:34.571510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.571330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:34.571670 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:34.571579 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:34.719548 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.719524 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:35:34.720023 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.719818 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"83da22435408d318d7b67a9c3c08a12ecac0b87527f153a493042009165a1dbf"} Apr 23 16:35:34.720158 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.720137 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:34.720306 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.720291 2578 scope.go:117] "RemoveContainer" containerID="c76e56c8ce40fe38ff195afaed823b06ea73b6dda7a8d4d9a3c730bdb709f416" Apr 23 16:35:34.721611 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.721587 2578 generic.go:358] "Generic (PLEG): container finished" podID="07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd" containerID="7054b238aa96427735b31a159fefa6eed6af73b1c25f950dc4592c31dac8e190" exitCode=0 Apr 23 16:35:34.721698 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.721667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerDied","Data":"7054b238aa96427735b31a159fefa6eed6af73b1c25f950dc4592c31dac8e190"} Apr 23 16:35:34.735668 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:34.735652 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:35.571352 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.571327 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:35.571501 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.571327 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:35.571501 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:35.571442 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:35.571589 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:35.571547 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:35.729457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.729284 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:35:35.729807 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.729777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" event={"ID":"3301fde8-0566-4365-a9d8-b069eb4bebb7","Type":"ContainerStarted","Data":"133c03779c601c70f74445cc479d05adda59603fc090bba7e617c98c4d4a662f"} Apr 23 16:35:35.730095 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.730073 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:35.730095 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.730105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:35.731522 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.731497 2578 generic.go:358] "Generic (PLEG): container finished" podID="07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd" containerID="d2150585769fc49dfae6fdd425f0b364a6541890b31ffe4d0eb82e73e43fa1e2" exitCode=0 Apr 23 16:35:35.731614 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.731541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerDied","Data":"d2150585769fc49dfae6fdd425f0b364a6541890b31ffe4d0eb82e73e43fa1e2"} Apr 23 16:35:35.745519 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.745498 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" Apr 23 16:35:35.780796 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.780751 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc" podStartSLOduration=10.676058062 podStartE2EDuration="27.780732352s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.061344496 +0000 UTC m=+3.083740589" lastFinishedPulling="2026-04-23 16:35:28.16601877 +0000 UTC m=+20.188414879" observedRunningTime="2026-04-23 16:35:35.77994015 +0000 UTC m=+27.802336280" watchObservedRunningTime="2026-04-23 16:35:35.780732352 +0000 UTC m=+27.803128467" Apr 23 16:35:35.903449 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.903359 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lpllm"] Apr 23 16:35:35.903602 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.903490 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:35.903659 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:35.903595 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:35.905940 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.905908 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5lhlh"] Apr 23 16:35:35.906055 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.906023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:35.906141 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:35.906118 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:35.912310 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.912283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h6kzn"] Apr 23 16:35:35.912440 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:35.912426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:35.912547 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:35.912528 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:36.735042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:36.735003 2578 generic.go:358] "Generic (PLEG): container finished" podID="07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd" containerID="4a4d093a139f245f726e3f2f003358307654a4f19052c9721cc2df77bf09be09" exitCode=0 Apr 23 16:35:36.735374 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:36.735065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerDied","Data":"4a4d093a139f245f726e3f2f003358307654a4f19052c9721cc2df77bf09be09"} Apr 23 16:35:37.571860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:37.571711 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:37.571860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:37.571733 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:37.571860 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:37.571842 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:37.572216 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:37.571966 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:37.572216 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:37.572019 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:37.572216 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:37.572084 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:39.571405 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:39.571365 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh" Apr 23 16:35:39.572049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:39.571421 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm" Apr 23 16:35:39.572049 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:39.571488 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5lhlh" podUID="017fd19b-a66e-4805-8f42-625a4749d380" Apr 23 16:35:39.572049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:39.571549 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:39.572049 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:39.571547 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lpllm" podUID="3c77861a-3b9a-47ae-9a06-cdc3b74145f7" Apr 23 16:35:39.572049 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:39.571637 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:35:41.347977 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.347947 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-102.ec2.internal" event="NodeReady" Apr 23 16:35:41.348645 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.348106 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:41.404896 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.404866 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g2wqn"] Apr 23 16:35:41.409422 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.409398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g2wqn" Apr 23 16:35:41.411896 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.411869 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 16:35:41.412012 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.411943 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jmk6t\"" Apr 23 16:35:41.412012 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.411956 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 16:35:41.414311 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.414255 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rmrtl"] Apr 23 16:35:41.417139 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.417121 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:35:41.421060 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.421038 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 16:35:41.421543 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.421522 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 16:35:41.421945 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.421830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s982g\"" Apr 23 16:35:41.422634 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.422062 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 16:35:41.422634 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.422525 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g2wqn"] Apr 23 16:35:41.431654 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.431629 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rmrtl"] Apr 23 16:35:41.502564 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.502532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2dl\" (UniqueName: \"kubernetes.io/projected/743aa8f5-75e8-4c04-8f4a-d49896428015-kube-api-access-zp2dl\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:35:41.502735 
Apr 23 16:35:41.502735 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.502607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.502735 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.502665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4db30a17-673a-4844-8750-e939b2e34518-tmp-dir\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.502735 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.502699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4db30a17-673a-4844-8750-e939b2e34518-config-volume\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.502918 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.502760 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-624fk\" (UniqueName: \"kubernetes.io/projected/4db30a17-673a-4844-8750-e939b2e34518-kube-api-access-624fk\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.572029 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.571998 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn"
Apr 23 16:35:41.572205 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.572002 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm"
Apr 23 16:35:41.572205 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.572013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh"
Apr 23 16:35:41.576358 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.576337 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 16:35:41.576667 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.576642 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f89h4\""
Apr 23 16:35:41.577177 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.577160 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6xqf4\""
Apr 23 16:35:41.577653 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.577637 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 16:35:41.578157 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.578131 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 16:35:41.579167 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.579149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 16:35:41.603135 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.603135 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4db30a17-673a-4844-8750-e939b2e34518-tmp-dir\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.603270 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4db30a17-673a-4844-8750-e939b2e34518-config-volume\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.603270 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-624fk\" (UniqueName: \"kubernetes.io/projected/4db30a17-673a-4844-8750-e939b2e34518-kube-api-access-624fk\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:41.603270 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:41.603232 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:41.603408 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2dl\" (UniqueName: \"kubernetes.io/projected/743aa8f5-75e8-4c04-8f4a-d49896428015-kube-api-access-zp2dl\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:35:41.603408 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:41.603288 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.103270157 +0000 UTC m=+34.125666266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found Apr 23 16:35:41.603408 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603320 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:35:41.603568 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:41.603417 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:41.603568 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:41.603453 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:42.103440628 +0000 UTC m=+34.125836725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found Apr 23 16:35:41.603568 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603496 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4db30a17-673a-4844-8750-e939b2e34518-tmp-dir\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn" Apr 23 16:35:41.603762 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.603740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4db30a17-673a-4844-8750-e939b2e34518-config-volume\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn" Apr 23 16:35:41.622675 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.622641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-624fk\" (UniqueName: \"kubernetes.io/projected/4db30a17-673a-4844-8750-e939b2e34518-kube-api-access-624fk\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn" Apr 23 16:35:41.622832 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:41.622696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2dl\" (UniqueName: \"kubernetes.io/projected/743aa8f5-75e8-4c04-8f4a-d49896428015-kube-api-access-zp2dl\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:35:42.107726 ip-10-0-129-102 kubenswrapper[2578]: I0423 
16:35:42.107694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:35:42.107726 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.107735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn" Apr 23 16:35:42.107991 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:42.107858 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 16:35:42.107991 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:42.107862 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 16:35:42.107991 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:42.107935 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:43.107916905 +0000 UTC m=+35.130312999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found Apr 23 16:35:42.107991 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:42.107955 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:43.107945916 +0000 UTC m=+35.130342013 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found Apr 23 16:35:42.208500 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.208287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:35:42.208656 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:42.208441 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 16:35:42.208656 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:42.208580 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:36:14.20856486 +0000 UTC m=+66.230960970 (durationBeforeRetry 32s). 
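The secret.go:189 failures above mean these mounts are blocked only on Secret objects that do not exist yet (presumably they are created later in the boot by their owning controllers). A minimal client-go sketch for checking from the outside whether they have appeared, assuming a reachable kubeconfig at the default location; the namespace/name pairs are taken from the log:

```go
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// The secrets the kubelet is waiting on in this boot, per the entries above.
	targets := map[string]string{
		"openshift-multus":         "metrics-daemon-secret",
		"openshift-dns":            "dns-default-metrics-tls",
		"openshift-ingress-canary": "canary-serving-cert",
	}
	for ns, name := range targets {
		_, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		switch {
		case apierrors.IsNotFound(err):
			fmt.Printf("%s/%s: still missing\n", ns, name) // same condition secret.go:189 reports
		case err != nil:
			fmt.Printf("%s/%s: lookup error: %v\n", ns, name, err)
		default:
			fmt.Printf("%s/%s: present\n", ns, name)
		}
	}
}
```

Until the secrets appear, the pods stay in ContainerCreating and the kubelet simply keeps retrying the mounts, as the rest of this log shows.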
Apr 23 16:35:42.308818 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.308785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh"
Apr 23 16:35:42.311561 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.311538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbbpz\" (UniqueName: \"kubernetes.io/projected/017fd19b-a66e-4805-8f42-625a4749d380-kube-api-access-kbbpz\") pod \"network-check-target-5lhlh\" (UID: \"017fd19b-a66e-4805-8f42-625a4749d380\") " pod="openshift-network-diagnostics/network-check-target-5lhlh"
Apr 23 16:35:42.495735 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.495616 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5lhlh"
Apr 23 16:35:42.653036 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.653010 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5lhlh"]
Apr 23 16:35:42.657829 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:42.657798 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017fd19b_a66e_4805_8f42_625a4749d380.slice/crio-070244fb9472ad67e9ec53b2b49c5b886aa6df3d8e39d76a7ef434698346c43e WatchSource:0}: Error finding container 070244fb9472ad67e9ec53b2b49c5b886aa6df3d8e39d76a7ef434698346c43e: Status 404 returned error can't find the container with id 070244fb9472ad67e9ec53b2b49c5b886aa6df3d8e39d76a7ef434698346c43e
Apr 23 16:35:42.748834 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.748751 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerStarted","Data":"6847b449d26df4cbc17abf1c9932ccb03ce245bc57b6043bafd6247f66f96ba1"}
Apr 23 16:35:42.749782 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:42.749755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5lhlh" event={"ID":"017fd19b-a66e-4805-8f42-625a4749d380","Type":"ContainerStarted","Data":"070244fb9472ad67e9ec53b2b49c5b886aa6df3d8e39d76a7ef434698346c43e"}
Apr 23 16:35:43.115673 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:43.115640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
Apr 23 16:35:43.115673 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:43.115677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:43.115915 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:43.115796 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:43.115915 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:43.115803 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:43.115915 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:43.115850 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:45.115836199 +0000 UTC m=+37.138232292 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:43.115915 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:43.115864 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:45.11585808 +0000 UTC m=+37.138254174 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found
Apr 23 16:35:43.754572 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:43.754539 2578 generic.go:358] "Generic (PLEG): container finished" podID="07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd" containerID="6847b449d26df4cbc17abf1c9932ccb03ce245bc57b6043bafd6247f66f96ba1" exitCode=0
Apr 23 16:35:43.755065 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:43.754609 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerDied","Data":"6847b449d26df4cbc17abf1c9932ccb03ce245bc57b6043bafd6247f66f96ba1"}
Apr 23 16:35:44.759649 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:44.759447 2578 generic.go:358] "Generic (PLEG): container finished" podID="07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd" containerID="00c762af4d0e6c7bf7d415b789d1366134e5653127ca63680961a5636314bf01" exitCode=0
Apr 23 16:35:44.760089 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:44.759533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerDied","Data":"00c762af4d0e6c7bf7d415b789d1366134e5653127ca63680961a5636314bf01"}
Apr 23 16:35:45.130024 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.129986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
Apr 23 16:35:45.130024 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.130024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:45.130261 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:45.130152 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:45.130261 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:45.130209 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.130191681 +0000 UTC m=+41.152587777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:45.130261 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:45.130150 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:45.130443 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:45.130302 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:49.130281652 +0000 UTC m=+41.152677761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found
Apr 23 16:35:45.764397 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.764352 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" event={"ID":"07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd","Type":"ContainerStarted","Data":"edd454f8ee4b00b86810d805931c8ebdfeb1f100a8886f1fced1e4324eb14a08"}
Apr 23 16:35:45.765616 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.765592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5lhlh" event={"ID":"017fd19b-a66e-4805-8f42-625a4749d380","Type":"ContainerStarted","Data":"ff4ffdfd057a49712b7d66c655c9d2b6e476e791f0d603b9fefe44c4a502abb7"}
Apr 23 16:35:45.765722 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.765706 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5lhlh"
Apr 23 16:35:45.797351 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.797309 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fmfdm" podStartSLOduration=6.343275491 podStartE2EDuration="37.797296478s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:11.056705971 +0000 UTC m=+3.079102064" lastFinishedPulling="2026-04-23 16:35:42.510726954 +0000 UTC m=+34.533123051" observedRunningTime="2026-04-23 16:35:45.794854796 +0000 UTC m=+37.817250911" watchObservedRunningTime="2026-04-23 16:35:45.797296478 +0000 UTC m=+37.819692593"
Apr 23 16:35:45.822176 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:45.822137 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5lhlh" podStartSLOduration=34.847068772 podStartE2EDuration="37.822123164s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:35:42.65958253 +0000 UTC m=+34.681978627" lastFinishedPulling="2026-04-23 16:35:45.634636924 +0000 UTC m=+37.657033019" observedRunningTime="2026-04-23 16:35:45.821618896 +0000 UTC m=+37.844015009" watchObservedRunningTime="2026-04-23 16:35:45.822123164 +0000 UTC m=+37.844519279"
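The two pod_startup_latency_tracker entries above encode a simple relation: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small sketch that reproduces the multus-additional-cni-plugins-fmfdm numbers from the wall-clock values in the entry; the tracker internally works from monotonic readings (the m=+ offsets), so the last few nanoseconds of the SLO figure differ:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout for the timestamps as they appear in the log entry.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s) // fractional seconds are accepted by Parse
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-04-23 16:35:08 +0000 UTC")
	firstPull := parse("2026-04-23 16:35:11.056705971 +0000 UTC")
	lastPull := parse("2026-04-23 16:35:42.510726954 +0000 UTC")
	running := parse("2026-04-23 16:35:45.797296478 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 37.797296478s, matching the log
	fmt.Println("podStartSLOduration:", slo) // ~6.343275s; logged as 6.343275491
}
```

In other words, the 37.8s end-to-end startup was dominated by ~31.5s of image pulling, and the SLO-relevant startup time was only ~6.3s.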
Apr 23 16:35:47.446423 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:47.446363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm"
Apr 23 16:35:47.450171 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:47.450150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c77861a-3b9a-47ae-9a06-cdc3b74145f7-original-pull-secret\") pod \"global-pull-secret-syncer-lpllm\" (UID: \"3c77861a-3b9a-47ae-9a06-cdc3b74145f7\") " pod="kube-system/global-pull-secret-syncer-lpllm"
Apr 23 16:35:47.589941 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:47.589902 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lpllm"
Apr 23 16:35:47.722090 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:47.722021 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lpllm"]
Apr 23 16:35:47.725039 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:35:47.725012 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c77861a_3b9a_47ae_9a06_cdc3b74145f7.slice/crio-b7fb1350c2f894aa518a458ed6b6dda69c65898dd4a5eb6b8c858ab3e04949fd WatchSource:0}: Error finding container b7fb1350c2f894aa518a458ed6b6dda69c65898dd4a5eb6b8c858ab3e04949fd: Status 404 returned error can't find the container with id b7fb1350c2f894aa518a458ed6b6dda69c65898dd4a5eb6b8c858ab3e04949fd
Apr 23 16:35:47.770307 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:47.770271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lpllm" event={"ID":"3c77861a-3b9a-47ae-9a06-cdc3b74145f7","Type":"ContainerStarted","Data":"b7fb1350c2f894aa518a458ed6b6dda69c65898dd4a5eb6b8c858ab3e04949fd"}
Apr 23 16:35:49.160352 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:49.160311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
Apr 23 16:35:49.160763 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:49.160358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:49.160763 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:49.160469 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:49.160763 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:49.160489 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:49.160763 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:49.160530 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:57.160512869 +0000 UTC m=+49.182908968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found
Apr 23 16:35:49.160763 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:49.160547 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:57.160538915 +0000 UTC m=+49.182935013 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found
Apr 23 16:35:51.779094 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:51.779008 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lpllm" event={"ID":"3c77861a-3b9a-47ae-9a06-cdc3b74145f7","Type":"ContainerStarted","Data":"9940b5e34f55f92dd942fb3549ed5c50cc95fe5b3bdcd31ec976b9fcce45b441"}
Apr 23 16:35:51.796959 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:51.796911 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lpllm" podStartSLOduration=33.18672115 podStartE2EDuration="36.796899102s" podCreationTimestamp="2026-04-23 16:35:15 +0000 UTC" firstStartedPulling="2026-04-23 16:35:47.726559841 +0000 UTC m=+39.748955939" lastFinishedPulling="2026-04-23 16:35:51.336737794 +0000 UTC m=+43.359133891" observedRunningTime="2026-04-23 16:35:51.79609949 +0000 UTC m=+43.818495608" watchObservedRunningTime="2026-04-23 16:35:51.796899102 +0000 UTC m=+43.819295273"
Apr 23 16:35:57.217865 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:57.217829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
Apr 23 16:35:57.217865 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:35:57.217865 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:35:57.218343 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:57.218018 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:35:57.218343 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:57.218104 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:13.218087967 +0000 UTC m=+65.240484066 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found
Apr 23 16:35:57.218343 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:57.218018 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:35:57.218343 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:35:57.218146 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:13.218135354 +0000 UTC m=+65.240531448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:07.747607 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:07.747581 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbtmc"
Apr 23 16:36:13.226327 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:13.226289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
Apr 23 16:36:13.226327 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:13.226325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:36:13.226871 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:13.226442 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:13.226871 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:13.226452 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:13.226871 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:13.226494 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:45.226479516 +0000 UTC m=+97.248875610 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found
Apr 23 16:36:13.226871 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:13.226507 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:45.226501402 +0000 UTC m=+97.248897495 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found
Apr 23 16:36:14.232916 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:14.232871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn"
Apr 23 16:36:14.233302 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:14.233008 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 16:36:14.233302 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:14.233069 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:37:18.233053016 +0000 UTC m=+130.255449110 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : secret "metrics-daemon-secret" not found
Apr 23 16:36:16.770238 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:16.770209 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5lhlh"
Apr 23 16:36:45.237625 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:45.237575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl"
Apr 23 16:36:45.237625 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:36:45.237628 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:36:45.238163 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:45.237727 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 16:36:45.238163 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:45.237791 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert podName:743aa8f5-75e8-4c04-8f4a-d49896428015 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:49.237774828 +0000 UTC m=+161.260170927 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert") pod "ingress-canary-rmrtl" (UID: "743aa8f5-75e8-4c04-8f4a-d49896428015") : secret "canary-serving-cert" not found
Apr 23 16:36:45.238163 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:45.237735 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 16:36:45.238163 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:36:45.237872 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls podName:4db30a17-673a-4844-8750-e939b2e34518 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:49.237860019 +0000 UTC m=+161.260256114 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls") pod "dns-default-g2wqn" (UID: "4db30a17-673a-4844-8750-e939b2e34518") : secret "dns-default-metrics-tls" not found
Apr 23 16:37:12.629359 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.629316 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67c5b6577b-5q6ph"]
Apr 23 16:37:12.632082 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.632062 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
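Across these retries the durationBeforeRetry doubles: 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s. That is the exponential backoff nestedpendingoperations applies to a repeatedly failing volume operation. A minimal sketch of the pattern; the 500ms base is visible in the log itself, while the 2m2s cap mirrors what I understand the kubelet's exponentialbackoff defaults to be and should be treated as an assumption:

```go
package main

import (
	"fmt"
	"time"
)

const (
	initialDurationBeforeRetry = 500 * time.Millisecond // first delay seen above
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
)

func main() {
	d := initialDurationBeforeRetry
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, d)
		d *= 2
		if d > maxDurationBeforeRetry {
			d = maxDurationBeforeRetry // later failures keep retrying at the cap
		}
	}
}
```

Running this prints exactly the sequence observed in the log (500ms through 1m4s), then settles at the cap, which is why a pod blocked on a missing secret keeps retrying indefinitely at a bounded rate rather than failing outright.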
Apr 23 16:37:12.635425 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.635404 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 23 16:37:12.635921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.635901 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 23 16:37:12.636146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.636127 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 23 16:37:12.636204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.636158 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8556\""
Apr 23 16:37:12.640969 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.640950 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 23 16:37:12.647240 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.647221 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67c5b6577b-5q6ph"]
Apr 23 16:37:12.722025 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.721992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-image-registry-private-configuration\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722172 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-certificates\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722172 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f1ab387-785c-479f-b2b2-27f092332c1b-ca-trust-extracted\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722172 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722129 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-installation-pull-secrets\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722172 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-bound-sa-token\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722297 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54qlk\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-kube-api-access-54qlk\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722297 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.722358 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.722306 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-trusted-ca\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.823424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54qlk\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-kube-api-access-54qlk\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.823468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.823487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-trusted-ca\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.823516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-image-registry-private-configuration\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.823546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-certificates\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.823569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f1ab387-785c-479f-b2b2-27f092332c1b-ca-trust-extracted\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:12.823584 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 16:37:12.823611 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:12.823608 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c5b6577b-5q6ph: secret "image-registry-tls" not found
Apr 23 16:37:12.824129 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:12.824092 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls podName:1f1ab387-785c-479f-b2b2-27f092332c1b nodeName:}" failed. No retries permitted until 2026-04-23 16:37:13.324062817 +0000 UTC m=+125.346458928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls") pod "image-registry-67c5b6577b-5q6ph" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b") : secret "image-registry-tls" not found
Apr 23 16:37:12.824791 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.824491 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-installation-pull-secrets\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.824791 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.824582 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f1ab387-785c-479f-b2b2-27f092332c1b-ca-trust-extracted\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.824791 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.824591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-bound-sa-token\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.825273 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.825248 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-certificates\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.826861 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.825298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-trusted-ca\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.828144 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.828117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-installation-pull-secrets\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.828986 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.828963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-image-registry-private-configuration\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.842299 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.842268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54qlk\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-kube-api-access-54qlk\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.848962 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.848936 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"]
Apr 23 16:37:12.851232 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.851211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-bound-sa-token\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:37:12.851990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.851974 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"]
Apr 23 16:37:12.852185 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.852113 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:12.854527 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.854505 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.854618 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.854541 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 16:37:12.854618 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.854516 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.854736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.854635 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 16:37:12.854887 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.854862 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jth4c\""
Apr 23 16:37:12.854952 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.854923 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"]
Apr 23 16:37:12.855156 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.855138 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:12.857688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.857670 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"
Apr 23 16:37:12.858118 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.858101 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 16:37:12.858191 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.858161 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 16:37:12.858253 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.858165 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.858428 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.858412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qqmkb\""
Apr 23 16:37:12.859518 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.859505 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.860765 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.860748 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.861203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.861185 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9s2w8\""
Apr 23 16:37:12.861868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.861768 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.861980 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.861906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 16:37:12.862838 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.862804 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"]
Apr 23 16:37:12.891505 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.891446 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"]
Apr 23 16:37:12.896119 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.896097 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"]
Apr 23 16:37:12.925794 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/71af0be6-1f33-49c7-ba45-d12899bb84e6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:12.925899 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925803 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmjf\" (UniqueName: \"kubernetes.io/projected/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-kube-api-access-twmjf\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:12.925899 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:12.925899 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925881 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:12.926001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925909 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49k2w\" (UniqueName: \"kubernetes.io/projected/71af0be6-1f33-49c7-ba45-d12899bb84e6-kube-api-access-49k2w\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:12.926001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64z98\" (UniqueName: \"kubernetes.io/projected/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-kube-api-access-64z98\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"
Apr 23 16:37:12.926001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.925980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:12.926091 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.926033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"
Apr 23 16:37:12.961690 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.961657 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"]
Apr 23 16:37:12.965370 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.965354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"
Apr 23 16:37:12.968504 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.968482 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 16:37:12.968651 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.968503 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:37:12.968785 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.968654 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-p8vft\""
Apr 23 16:37:12.968785 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.968726 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 16:37:12.969375 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.969350 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 16:37:12.983597 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:12.983573 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"]
Apr 23 16:37:13.026661 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:13.026661 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4eef891-cf79-4965-b0db-94974d87932b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/71af0be6-1f33-49c7-ba45-d12899bb84e6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twmjf\" (UniqueName: \"kubernetes.io/projected/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-kube-api-access-twmjf\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.026770 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.026832 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 16:37:13.026848 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.026840 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls podName:71af0be6-1f33-49c7-ba45-d12899bb84e6 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:13.526824082 +0000 UTC m=+125.549220190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7nckc" (UID: "71af0be6-1f33-49c7-ba45-d12899bb84e6") : secret "cluster-monitoring-operator-tls" not found
Apr 23 16:37:13.027203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:13.027203 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.026928 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls podName:1a83f99d-af3c-4f7f-ba85-ee5701997cd8 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:13.526911404 +0000 UTC m=+125.549307499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lqxpv" (UID: "1a83f99d-af3c-4f7f-ba85-ee5701997cd8") : secret "samples-operator-tls" not found
Apr 23 16:37:13.027203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.026966 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqr2l\" (UniqueName: \"kubernetes.io/projected/e4eef891-cf79-4965-b0db-94974d87932b-kube-api-access-jqr2l\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"
Apr 23 16:37:13.027203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.027004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49k2w\" (UniqueName: \"kubernetes.io/projected/71af0be6-1f33-49c7-ba45-d12899bb84e6-kube-api-access-49k2w\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:13.027203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.027057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4eef891-cf79-4965-b0db-94974d87932b-config\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"
Apr 23 16:37:13.027203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.027119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64z98\" (UniqueName: \"kubernetes.io/projected/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-kube-api-access-64z98\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"
Apr 23 16:37:13.027506 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.027478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"
Apr 23 16:37:13.027619 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.027606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/71af0be6-1f33-49c7-ba45-d12899bb84e6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"
Apr 23 16:37:13.029336 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.029320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") "
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" Apr 23 16:37:13.035579 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.035550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmjf\" (UniqueName: \"kubernetes.io/projected/6b7ec9ae-872e-40fc-8d51-650ccb39c97b-kube-api-access-twmjf\") pod \"kube-storage-version-migrator-operator-6769c5d45-94hpw\" (UID: \"6b7ec9ae-872e-40fc-8d51-650ccb39c97b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" Apr 23 16:37:13.035736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.035718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49k2w\" (UniqueName: \"kubernetes.io/projected/71af0be6-1f33-49c7-ba45-d12899bb84e6-kube-api-access-49k2w\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:13.036393 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.036358 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64z98\" (UniqueName: \"kubernetes.io/projected/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-kube-api-access-64z98\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:13.128038 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.128002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4eef891-cf79-4965-b0db-94974d87932b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.128193 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.128109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqr2l\" (UniqueName: \"kubernetes.io/projected/e4eef891-cf79-4965-b0db-94974d87932b-kube-api-access-jqr2l\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.128193 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.128139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4eef891-cf79-4965-b0db-94974d87932b-config\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.128648 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.128626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4eef891-cf79-4965-b0db-94974d87932b-config\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.130053 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.130036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4eef891-cf79-4965-b0db-94974d87932b-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.140671 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.140654 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqr2l\" (UniqueName: \"kubernetes.io/projected/e4eef891-cf79-4965-b0db-94974d87932b-kube-api-access-jqr2l\") pod \"service-ca-operator-d6fc45fc5-r2qw8\" (UID: \"e4eef891-cf79-4965-b0db-94974d87932b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.173273 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.173207 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" Apr 23 16:37:13.273775 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.273742 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" Apr 23 16:37:13.293292 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.293261 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw"] Apr 23 16:37:13.296315 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:13.296279 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b7ec9ae_872e_40fc_8d51_650ccb39c97b.slice/crio-a84e86d0ec47667a88894deefd3660add3536435387301140ff7aa7a3469c056 WatchSource:0}: Error finding container a84e86d0ec47667a88894deefd3660add3536435387301140ff7aa7a3469c056: Status 404 returned error can't find the container with id a84e86d0ec47667a88894deefd3660add3536435387301140ff7aa7a3469c056 Apr 23 16:37:13.330123 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.330096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:13.330261 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.330243 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:37:13.330312 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.330263 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c5b6577b-5q6ph: secret "image-registry-tls" not found Apr 23 16:37:13.330352 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.330316 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls podName:1f1ab387-785c-479f-b2b2-27f092332c1b nodeName:}" failed. No retries permitted until 2026-04-23 16:37:14.330299755 +0000 UTC m=+126.352695849 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls") pod "image-registry-67c5b6577b-5q6ph" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b") : secret "image-registry-tls" not found Apr 23 16:37:13.396315 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.396285 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8"] Apr 23 16:37:13.399848 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:13.399818 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4eef891_cf79_4965_b0db_94974d87932b.slice/crio-06e1e6fa48a12105412f5db6ac4436c380a46a82b962fb5e5825d4a9e8472947 WatchSource:0}: Error finding container 06e1e6fa48a12105412f5db6ac4436c380a46a82b962fb5e5825d4a9e8472947: Status 404 returned error can't find the container with id 06e1e6fa48a12105412f5db6ac4436c380a46a82b962fb5e5825d4a9e8472947 Apr 23 16:37:13.531676 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.531585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:13.531676 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.531670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:13.531884 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.531775 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:13.531884 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.531838 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls podName:1a83f99d-af3c-4f7f-ba85-ee5701997cd8 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:14.531818799 +0000 UTC m=+126.554214895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lqxpv" (UID: "1a83f99d-af3c-4f7f-ba85-ee5701997cd8") : secret "samples-operator-tls" not found Apr 23 16:37:13.531884 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.531775 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:13.532027 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:13.531936 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls podName:71af0be6-1f33-49c7-ba45-d12899bb84e6 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:14.531915321 +0000 UTC m=+126.554311426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7nckc" (UID: "71af0be6-1f33-49c7-ba45-d12899bb84e6") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:13.934228 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.934189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" event={"ID":"e4eef891-cf79-4965-b0db-94974d87932b","Type":"ContainerStarted","Data":"06e1e6fa48a12105412f5db6ac4436c380a46a82b962fb5e5825d4a9e8472947"} Apr 23 16:37:13.935132 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:13.935107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" event={"ID":"6b7ec9ae-872e-40fc-8d51-650ccb39c97b","Type":"ContainerStarted","Data":"a84e86d0ec47667a88894deefd3660add3536435387301140ff7aa7a3469c056"} Apr 23 16:37:14.338333 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:14.338291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:14.338550 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.338483 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:37:14.338550 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.338506 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c5b6577b-5q6ph: secret "image-registry-tls" not found Apr 23 16:37:14.338664 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.338575 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls podName:1f1ab387-785c-479f-b2b2-27f092332c1b nodeName:}" failed. No retries permitted until 2026-04-23 16:37:16.338554296 +0000 UTC m=+128.360950395 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls") pod "image-registry-67c5b6577b-5q6ph" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b") : secret "image-registry-tls" not found Apr 23 16:37:14.539329 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:14.539294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:14.539545 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.539457 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:14.539545 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:14.539468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:14.539545 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.539527 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls podName:71af0be6-1f33-49c7-ba45-d12899bb84e6 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:16.539507643 +0000 UTC m=+128.561903742 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7nckc" (UID: "71af0be6-1f33-49c7-ba45-d12899bb84e6") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:14.539705 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.539564 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:14.539705 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:14.539614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls podName:1a83f99d-af3c-4f7f-ba85-ee5701997cd8 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:16.539599305 +0000 UTC m=+128.561995402 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lqxpv" (UID: "1a83f99d-af3c-4f7f-ba85-ee5701997cd8") : secret "samples-operator-tls" not found Apr 23 16:37:15.940856 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:15.940766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" event={"ID":"e4eef891-cf79-4965-b0db-94974d87932b","Type":"ContainerStarted","Data":"768235473bf6e11d44e3c5a4c0986c0cd03d451e2d4cefdebd50e0f5e4b16b9d"} Apr 23 16:37:15.942141 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:15.942111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" event={"ID":"6b7ec9ae-872e-40fc-8d51-650ccb39c97b","Type":"ContainerStarted","Data":"4658ca6d951865bef26ee18d25e5eea2da97882955b6311749cc044e3b541009"} Apr 23 16:37:15.957038 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:15.956988 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" podStartSLOduration=1.669658272 podStartE2EDuration="3.956973507s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:13.401558642 +0000 UTC m=+125.423954740" lastFinishedPulling="2026-04-23 16:37:15.68887388 +0000 UTC m=+127.711269975" observedRunningTime="2026-04-23 16:37:15.956173555 +0000 UTC m=+127.978569672" watchObservedRunningTime="2026-04-23 16:37:15.956973507 +0000 UTC m=+127.979369624" Apr 23 16:37:15.973456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:15.973400 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" podStartSLOduration=1.5801318439999998 podStartE2EDuration="3.973361233s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:13.29794349 +0000 UTC m=+125.320339586" lastFinishedPulling="2026-04-23 16:37:15.691172867 +0000 UTC m=+127.713568975" observedRunningTime="2026-04-23 16:37:15.972418989 +0000 UTC m=+127.994815106" watchObservedRunningTime="2026-04-23 16:37:15.973361233 +0000 UTC m=+127.995757351" Apr 23 16:37:16.357867 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:16.357830 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:16.358027 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.358004 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:37:16.358067 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.358029 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c5b6577b-5q6ph: secret "image-registry-tls" not found Apr 23 16:37:16.358123 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.358112 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls 
podName:1f1ab387-785c-479f-b2b2-27f092332c1b nodeName:}" failed. No retries permitted until 2026-04-23 16:37:20.358090183 +0000 UTC m=+132.380486280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls") pod "image-registry-67c5b6577b-5q6ph" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b") : secret "image-registry-tls" not found Apr 23 16:37:16.558942 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:16.558900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:16.559118 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:16.558973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:16.559118 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.559058 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:16.559118 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.559078 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:16.559216 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.559138 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls podName:1a83f99d-af3c-4f7f-ba85-ee5701997cd8 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:20.559119675 +0000 UTC m=+132.581515785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lqxpv" (UID: "1a83f99d-af3c-4f7f-ba85-ee5701997cd8") : secret "samples-operator-tls" not found Apr 23 16:37:16.559216 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:16.559156 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls podName:71af0be6-1f33-49c7-ba45-d12899bb84e6 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:20.559148242 +0000 UTC m=+132.581544335 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7nckc" (UID: "71af0be6-1f33-49c7-ba45-d12899bb84e6") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:18.284882 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:18.284829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:37:18.285351 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:18.284976 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 16:37:18.285351 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:18.285040 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs podName:b7f21f2f-2763-41c8-af5e-52de8001226b nodeName:}" failed. No retries permitted until 2026-04-23 16:39:20.285023363 +0000 UTC m=+252.307419456 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs") pod "network-metrics-daemon-h6kzn" (UID: "b7f21f2f-2763-41c8-af5e-52de8001226b") : secret "metrics-daemon-secret" not found Apr 23 16:37:20.271021 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.270988 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns"] Apr 23 16:37:20.274146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.274129 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" Apr 23 16:37:20.276221 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.276201 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-f2mkr\"" Apr 23 16:37:20.284362 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.284341 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns"] Apr 23 16:37:20.391609 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.391580 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-frq2q_74494667-d025-4d57-be34-03a72ee7cbaa/dns-node-resolver/0.log" Apr 23 16:37:20.400936 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.400909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:20.401013 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.400991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg6nq\" (UniqueName: \"kubernetes.io/projected/d3c992ed-435c-4d40-bed7-1069bba6e643-kube-api-access-gg6nq\") pod \"network-check-source-8894fc9bd-vrnns\" (UID: \"d3c992ed-435c-4d40-bed7-1069bba6e643\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" Apr 23 16:37:20.401070 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.401056 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 16:37:20.401105 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.401073 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67c5b6577b-5q6ph: secret "image-registry-tls" not found Apr 23 16:37:20.401138 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.401126 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls podName:1f1ab387-785c-479f-b2b2-27f092332c1b nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.401110508 +0000 UTC m=+140.423506603 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls") pod "image-registry-67c5b6577b-5q6ph" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b") : secret "image-registry-tls" not found Apr 23 16:37:20.502045 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.502008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg6nq\" (UniqueName: \"kubernetes.io/projected/d3c992ed-435c-4d40-bed7-1069bba6e643-kube-api-access-gg6nq\") pod \"network-check-source-8894fc9bd-vrnns\" (UID: \"d3c992ed-435c-4d40-bed7-1069bba6e643\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" Apr 23 16:37:20.511356 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.511325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg6nq\" (UniqueName: \"kubernetes.io/projected/d3c992ed-435c-4d40-bed7-1069bba6e643-kube-api-access-gg6nq\") pod \"network-check-source-8894fc9bd-vrnns\" (UID: \"d3c992ed-435c-4d40-bed7-1069bba6e643\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" Apr 23 16:37:20.582919 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.582893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" Apr 23 16:37:20.602742 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.602713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:20.602883 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.602786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:20.602883 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.602868 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:20.602990 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.602895 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 16:37:20.602990 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.602952 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls podName:71af0be6-1f33-49c7-ba45-d12899bb84e6 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.602931439 +0000 UTC m=+140.625327550 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7nckc" (UID: "71af0be6-1f33-49c7-ba45-d12899bb84e6") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:20.602990 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:20.602970 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls podName:1a83f99d-af3c-4f7f-ba85-ee5701997cd8 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:28.602961578 +0000 UTC m=+140.625357678 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lqxpv" (UID: "1a83f99d-af3c-4f7f-ba85-ee5701997cd8") : secret "samples-operator-tls" not found Apr 23 16:37:20.693867 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.693811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns"] Apr 23 16:37:20.696571 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:20.696542 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c992ed_435c_4d40_bed7_1069bba6e643.slice/crio-8acecc12b149db4dd295a4306bbadab2b8294d4e78a43fb95f72508cadf13b1b WatchSource:0}: Error finding container 8acecc12b149db4dd295a4306bbadab2b8294d4e78a43fb95f72508cadf13b1b: Status 404 returned error can't find the container with id 8acecc12b149db4dd295a4306bbadab2b8294d4e78a43fb95f72508cadf13b1b Apr 23 16:37:20.952840 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.952739 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" event={"ID":"d3c992ed-435c-4d40-bed7-1069bba6e643","Type":"ContainerStarted","Data":"5487c6325ccd99303fc59e0a5a24c34e85d25cb6f1e0e09e039174ed2d745fbd"} Apr 23 16:37:20.952840 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.952778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" event={"ID":"d3c992ed-435c-4d40-bed7-1069bba6e643","Type":"ContainerStarted","Data":"8acecc12b149db4dd295a4306bbadab2b8294d4e78a43fb95f72508cadf13b1b"} Apr 23 16:37:20.968147 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:20.968110 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vrnns" podStartSLOduration=0.968097074 podStartE2EDuration="968.097074ms" podCreationTimestamp="2026-04-23 16:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:20.967642766 +0000 UTC m=+132.990038882" watchObservedRunningTime="2026-04-23 16:37:20.968097074 +0000 UTC m=+132.990493189" Apr 23 16:37:21.393823 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:21.393799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xsjmw_0cf2af80-3ff4-4717-af9c-87bb29677708/node-ca/0.log" Apr 23 16:37:22.793630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:22.793600 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-94hpw_6b7ec9ae-872e-40fc-8d51-650ccb39c97b/kube-storage-version-migrator-operator/0.log" Apr 23 16:37:28.464599 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.464570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:28.466893 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.466869 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"image-registry-67c5b6577b-5q6ph\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") " pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:28.540876 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.540852 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:28.665946 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.665912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:28.666140 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.665979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:28.666140 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:28.666060 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:28.666140 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:28.666125 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls podName:71af0be6-1f33-49c7-ba45-d12899bb84e6 nodeName:}" failed. No retries permitted until 2026-04-23 16:37:44.666107675 +0000 UTC m=+156.688503786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7nckc" (UID: "71af0be6-1f33-49c7-ba45-d12899bb84e6") : secret "cluster-monitoring-operator-tls" not found Apr 23 16:37:28.669242 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.669213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a83f99d-af3c-4f7f-ba85-ee5701997cd8-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lqxpv\" (UID: \"1a83f99d-af3c-4f7f-ba85-ee5701997cd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:28.669902 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.669878 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67c5b6577b-5q6ph"] Apr 23 16:37:28.673615 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:28.673591 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f1ab387_785c_479f_b2b2_27f092332c1b.slice/crio-bebc79c6e41d408a9b1f2fdd8ddab1d4fecef7da45f06c86de93f58356069cad WatchSource:0}: Error finding container bebc79c6e41d408a9b1f2fdd8ddab1d4fecef7da45f06c86de93f58356069cad: Status 404 returned error can't find the container with id bebc79c6e41d408a9b1f2fdd8ddab1d4fecef7da45f06c86de93f58356069cad Apr 23 16:37:28.776955 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.776927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" Apr 23 16:37:28.898561 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.898528 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv"] Apr 23 16:37:28.972064 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.972033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" event={"ID":"1f1ab387-785c-479f-b2b2-27f092332c1b","Type":"ContainerStarted","Data":"f9c9787c2acd1a88af15b920115dcaab39e9abf9af00750fa355eb8287e26f3c"} Apr 23 16:37:28.972152 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.972069 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" event={"ID":"1f1ab387-785c-479f-b2b2-27f092332c1b","Type":"ContainerStarted","Data":"bebc79c6e41d408a9b1f2fdd8ddab1d4fecef7da45f06c86de93f58356069cad"} Apr 23 16:37:28.972152 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.972142 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" Apr 23 16:37:28.973037 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:28.973012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" event={"ID":"1a83f99d-af3c-4f7f-ba85-ee5701997cd8","Type":"ContainerStarted","Data":"462922a3068b455ee0453c4a25fb1502273e719f970190b80e3bacdff53ef15e"} Apr 23 16:37:31.982765 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:31.982730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" 
event={"ID":"1a83f99d-af3c-4f7f-ba85-ee5701997cd8","Type":"ContainerStarted","Data":"1bee92765fc83a6a39245692ef5dd20555667f14f36b851ec90ffa6bee97a083"} Apr 23 16:37:31.982765 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:31.982766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" event={"ID":"1a83f99d-af3c-4f7f-ba85-ee5701997cd8","Type":"ContainerStarted","Data":"9a5dc30058d5bd6799949520bcb6c5843281df34bae748b8bd1affc1b382a8f6"} Apr 23 16:37:31.999712 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:31.999653 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" podStartSLOduration=19.999635767 podStartE2EDuration="19.999635767s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:28.997202974 +0000 UTC m=+141.019599115" watchObservedRunningTime="2026-04-23 16:37:31.999635767 +0000 UTC m=+144.022031884" Apr 23 16:37:32.000209 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:32.000179 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lqxpv" podStartSLOduration=17.622904788 podStartE2EDuration="20.000172612s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:28.934105705 +0000 UTC m=+140.956501800" lastFinishedPulling="2026-04-23 16:37:31.311373516 +0000 UTC m=+143.333769624" observedRunningTime="2026-04-23 16:37:31.999411561 +0000 UTC m=+144.021807677" watchObservedRunningTime="2026-04-23 16:37:32.000172612 +0000 UTC m=+144.022568728" Apr 23 16:37:41.030700 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.030667 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q"] Apr 23 16:37:41.035121 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.035103 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.039100 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.039067 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 16:37:41.039214 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.039069 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zwp9x\"" Apr 23 16:37:41.039214 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.039118 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 16:37:41.049222 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.049196 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q"] Apr 23 16:37:41.060191 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.060160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/203f31ff-6191-4108-83b4-7a8cd9446ee7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wxt5q\" (UID: \"203f31ff-6191-4108-83b4-7a8cd9446ee7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.060286 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.060211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/203f31ff-6191-4108-83b4-7a8cd9446ee7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wxt5q\" (UID: \"203f31ff-6191-4108-83b4-7a8cd9446ee7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.160830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.160788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/203f31ff-6191-4108-83b4-7a8cd9446ee7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wxt5q\" (UID: \"203f31ff-6191-4108-83b4-7a8cd9446ee7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.160990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.160870 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/203f31ff-6191-4108-83b4-7a8cd9446ee7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wxt5q\" (UID: \"203f31ff-6191-4108-83b4-7a8cd9446ee7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.161559 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.161541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/203f31ff-6191-4108-83b4-7a8cd9446ee7-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wxt5q\" (UID: \"203f31ff-6191-4108-83b4-7a8cd9446ee7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.163066 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.163046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/203f31ff-6191-4108-83b4-7a8cd9446ee7-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wxt5q\" (UID: \"203f31ff-6191-4108-83b4-7a8cd9446ee7\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.175442 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.175420 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67c5b6577b-5q6ph"] Apr 23 16:37:41.210368 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.210336 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-csnvz"] Apr 23 16:37:41.213536 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.213521 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-csnvz" Apr 23 16:37:41.216213 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:41.216181 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-129-102.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'ip-10-0-129-102.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 23 16:37:41.216324 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.216285 2578 status_manager.go:895] "Failed to get status for pod" podUID="99a16fcc-e537-4736-a7fd-4a673684aa6e" pod="openshift-console/downloads-6bcc868b7-csnvz" err="pods \"downloads-6bcc868b7-csnvz\" is forbidden: User \"system:node:ip-10-0-129-102.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'ip-10-0-129-102.ec2.internal' and this object" Apr 23 16:37:41.217271 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:41.217246 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-129-102.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'ip-10-0-129-102.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 23 16:37:41.217421 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:41.217397 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-dockercfg-jnds7\" is forbidden: User \"system:node:ip-10-0-129-102.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'ip-10-0-129-102.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console\"/\"default-dockercfg-jnds7\"" type="*v1.Secret" Apr 23 16:37:41.223759 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.223739 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lgpt4"] Apr 23 16:37:41.226833 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.226815 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.236754 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.236735 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-csnvz"] Apr 23 16:37:41.239226 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.239210 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:37:41.239289 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.239248 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nnwmz\"" Apr 23 16:37:41.240921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.240897 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 16:37:41.241042 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.241027 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:37:41.241572 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.241532 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 16:37:41.261505 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.261482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.261610 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.261530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-data-volume\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.261610 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.261557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.261610 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.261581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4l6\" (UniqueName: \"kubernetes.io/projected/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-kube-api-access-zw4l6\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.261715 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.261660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpxdl\" (UniqueName: \"kubernetes.io/projected/99a16fcc-e537-4736-a7fd-4a673684aa6e-kube-api-access-mpxdl\") pod \"downloads-6bcc868b7-csnvz\" (UID: 
\"99a16fcc-e537-4736-a7fd-4a673684aa6e\") " pod="openshift-console/downloads-6bcc868b7-csnvz" Apr 23 16:37:41.261715 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.261704 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-crio-socket\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.278123 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.278099 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lgpt4"] Apr 23 16:37:41.298714 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.298647 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-58544877c9-fd586"] Apr 23 16:37:41.301775 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.301754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.331781 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.331757 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58544877c9-fd586"] Apr 23 16:37:41.344083 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.344062 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" Apr 23 16:37:41.362074 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362045 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-crio-socket\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362201 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362095 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362201 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-trusted-ca\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362201 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-data-volume\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362201 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362217 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-crio-socket\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4l6\" (UniqueName: \"kubernetes.io/projected/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-kube-api-access-zw4l6\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362277 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-registry-tls\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-registry-certificates\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-bound-sa-token\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362672 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362469 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpxdl\" (UniqueName: \"kubernetes.io/projected/99a16fcc-e537-4736-a7fd-4a673684aa6e-kube-api-access-mpxdl\") pod \"downloads-6bcc868b7-csnvz\" (UID: \"99a16fcc-e537-4736-a7fd-4a673684aa6e\") " pod="openshift-console/downloads-6bcc868b7-csnvz" Apr 23 16:37:41.362672 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-installation-pull-secrets\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362672 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-ca-trust-extracted\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362672 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362595 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-data-volume\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.362672 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-image-registry-private-configuration\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362672 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69wtt\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-kube-api-access-69wtt\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.362932 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.362773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.364848 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.364828 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.379425 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.379398 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4l6\" (UniqueName: \"kubernetes.io/projected/d042bc1e-b16b-4b25-a9e1-19e50f2c799f-kube-api-access-zw4l6\") pod \"insights-runtime-extractor-lgpt4\" (UID: \"d042bc1e-b16b-4b25-a9e1-19e50f2c799f\") " pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.463159 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69wtt\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-kube-api-access-69wtt\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463323 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463182 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-trusted-ca\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463323 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-registry-tls\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463323 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-registry-certificates\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463323 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-bound-sa-token\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463323 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-installation-pull-secrets\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463591 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-ca-trust-extracted\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463591 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-image-registry-private-configuration\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.463907 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.463871 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-ca-trust-extracted\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.464360 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.464341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-registry-certificates\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.464608 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.464584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-trusted-ca\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.465608 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.465584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-installation-pull-secrets\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.465693 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.465634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-registry-tls\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.465816 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.465799 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-image-registry-private-configuration\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.482599 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.482567 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q"] Apr 23 16:37:41.485165 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.485135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-bound-sa-token\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.485460 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.485445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69wtt\" (UniqueName: \"kubernetes.io/projected/0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44-kube-api-access-69wtt\") pod \"image-registry-58544877c9-fd586\" (UID: \"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44\") " pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.485991 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:41.485972 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod203f31ff_6191_4108_83b4_7a8cd9446ee7.slice/crio-81a161ed897eec8e08990e38bcdf2a5e1bfe8187255a41cc16d27292921cb0bb WatchSource:0}: Error finding container 81a161ed897eec8e08990e38bcdf2a5e1bfe8187255a41cc16d27292921cb0bb: Status 404 returned error can't find the container with id 
81a161ed897eec8e08990e38bcdf2a5e1bfe8187255a41cc16d27292921cb0bb Apr 23 16:37:41.534597 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.534579 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lgpt4" Apr 23 16:37:41.609912 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.609884 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:41.655190 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.655132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lgpt4"] Apr 23 16:37:41.658960 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:41.658917 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd042bc1e_b16b_4b25_a9e1_19e50f2c799f.slice/crio-0de56437964cc91eac584b5074da9ef90ec2f00112c2a6c766f51f5f65c7d802 WatchSource:0}: Error finding container 0de56437964cc91eac584b5074da9ef90ec2f00112c2a6c766f51f5f65c7d802: Status 404 returned error can't find the container with id 0de56437964cc91eac584b5074da9ef90ec2f00112c2a6c766f51f5f65c7d802 Apr 23 16:37:41.762832 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:41.762807 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58544877c9-fd586"] Apr 23 16:37:41.765913 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:41.765891 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3f1e66_afce_41c0_a9fe_72c3a0eb1f44.slice/crio-5f4085373aa0fbf1a4f10623bb8580b086aebbda69c6387d1d0177b36b57a8fb WatchSource:0}: Error finding container 5f4085373aa0fbf1a4f10623bb8580b086aebbda69c6387d1d0177b36b57a8fb: Status 404 returned error can't find the container with id 5f4085373aa0fbf1a4f10623bb8580b086aebbda69c6387d1d0177b36b57a8fb Apr 23 16:37:42.011278 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.011195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" event={"ID":"203f31ff-6191-4108-83b4-7a8cd9446ee7","Type":"ContainerStarted","Data":"81a161ed897eec8e08990e38bcdf2a5e1bfe8187255a41cc16d27292921cb0bb"} Apr 23 16:37:42.012476 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.012447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58544877c9-fd586" event={"ID":"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44","Type":"ContainerStarted","Data":"812d9f3fa8f80eac73c34403274f8604cb06d2c12f9f713632192673483efad3"} Apr 23 16:37:42.012476 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.012481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58544877c9-fd586" event={"ID":"0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44","Type":"ContainerStarted","Data":"5f4085373aa0fbf1a4f10623bb8580b086aebbda69c6387d1d0177b36b57a8fb"} Apr 23 16:37:42.012661 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.012537 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-58544877c9-fd586" Apr 23 16:37:42.013817 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.013796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lgpt4" 
event={"ID":"d042bc1e-b16b-4b25-a9e1-19e50f2c799f","Type":"ContainerStarted","Data":"4d4e7186a878d3be2cbad2d5fea287d6f6f21647d6012a79a3393b556bb041df"} Apr 23 16:37:42.013817 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.013820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lgpt4" event={"ID":"d042bc1e-b16b-4b25-a9e1-19e50f2c799f","Type":"ContainerStarted","Data":"0de56437964cc91eac584b5074da9ef90ec2f00112c2a6c766f51f5f65c7d802"} Apr 23 16:37:42.048679 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.048627 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-58544877c9-fd586" podStartSLOduration=1.048608319 podStartE2EDuration="1.048608319s" podCreationTimestamp="2026-04-23 16:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:42.04711443 +0000 UTC m=+154.069510558" watchObservedRunningTime="2026-04-23 16:37:42.048608319 +0000 UTC m=+154.071004438" Apr 23 16:37:42.091136 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.091106 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:37:42.543922 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.543842 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:37:42.551179 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.551156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpxdl\" (UniqueName: \"kubernetes.io/projected/99a16fcc-e537-4736-a7fd-4a673684aa6e-kube-api-access-mpxdl\") pod \"downloads-6bcc868b7-csnvz\" (UID: \"99a16fcc-e537-4736-a7fd-4a673684aa6e\") " pod="openshift-console/downloads-6bcc868b7-csnvz" Apr 23 16:37:42.576146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.576128 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-jnds7\"" Apr 23 16:37:42.721997 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.721960 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-csnvz" Apr 23 16:37:42.846174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:42.846135 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-csnvz"] Apr 23 16:37:42.850399 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:42.850350 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a16fcc_e537_4736_a7fd_4a673684aa6e.slice/crio-b3ec15c0f9bcedddb17a40ed9d44524d2481ef3e585fcc4027261b21b827744f WatchSource:0}: Error finding container b3ec15c0f9bcedddb17a40ed9d44524d2481ef3e585fcc4027261b21b827744f: Status 404 returned error can't find the container with id b3ec15c0f9bcedddb17a40ed9d44524d2481ef3e585fcc4027261b21b827744f Apr 23 16:37:43.018442 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:43.018396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lgpt4" event={"ID":"d042bc1e-b16b-4b25-a9e1-19e50f2c799f","Type":"ContainerStarted","Data":"9c1978d6cb2828e88aebf39dc1c7c549134acde5089e43419ff65cc35ed7fa83"} Apr 23 16:37:43.019757 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:43.019728 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" event={"ID":"203f31ff-6191-4108-83b4-7a8cd9446ee7","Type":"ContainerStarted","Data":"54b38ae9d007b91fd3471def9c13d7b86b18499f81383a9f43276dfbb76a5ccb"} Apr 23 16:37:43.020924 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:43.020896 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-csnvz" event={"ID":"99a16fcc-e537-4736-a7fd-4a673684aa6e","Type":"ContainerStarted","Data":"b3ec15c0f9bcedddb17a40ed9d44524d2481ef3e585fcc4027261b21b827744f"} Apr 23 16:37:43.038410 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:43.038352 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wxt5q" podStartSLOduration=1.9501168660000001 podStartE2EDuration="3.038342389s" podCreationTimestamp="2026-04-23 16:37:40 +0000 UTC" firstStartedPulling="2026-04-23 16:37:41.488186608 +0000 UTC m=+153.510582702" lastFinishedPulling="2026-04-23 16:37:42.576412116 +0000 UTC m=+154.598808225" observedRunningTime="2026-04-23 16:37:43.03689488 +0000 UTC m=+155.059291008" watchObservedRunningTime="2026-04-23 16:37:43.038342389 +0000 UTC m=+155.060738504" Apr 23 16:37:44.024962 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:44.024888 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lgpt4" event={"ID":"d042bc1e-b16b-4b25-a9e1-19e50f2c799f","Type":"ContainerStarted","Data":"efab61d4d05326529e59a57dec6de1a7e18b0ff10aa4f98605eb4475f1dd3cef"} Apr 23 16:37:44.048125 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:44.048065 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lgpt4" podStartSLOduration=1.036550295 podStartE2EDuration="3.048045505s" podCreationTimestamp="2026-04-23 16:37:41 +0000 UTC" firstStartedPulling="2026-04-23 16:37:41.741238812 +0000 UTC m=+153.763634910" lastFinishedPulling="2026-04-23 16:37:43.752734012 +0000 UTC m=+155.775130120" observedRunningTime="2026-04-23 16:37:44.046673684 +0000 UTC m=+156.069069823" watchObservedRunningTime="2026-04-23 16:37:44.048045505 +0000 UTC m=+156.070441623" Apr 23 
16:37:44.421615 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:44.421567 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-g2wqn" podUID="4db30a17-673a-4844-8750-e939b2e34518" Apr 23 16:37:44.430731 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:44.430690 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rmrtl" podUID="743aa8f5-75e8-4c04-8f4a-d49896428015" Apr 23 16:37:44.583696 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:37:44.583656 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-h6kzn" podUID="b7f21f2f-2763-41c8-af5e-52de8001226b" Apr 23 16:37:44.690649 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:44.690555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:44.693393 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:44.693348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/71af0be6-1f33-49c7-ba45-d12899bb84e6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7nckc\" (UID: \"71af0be6-1f33-49c7-ba45-d12899bb84e6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:44.966625 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:44.966525 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" Apr 23 16:37:45.031340 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:45.031315 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:37:45.031758 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:45.031419 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g2wqn" Apr 23 16:37:45.090001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:45.089972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc"] Apr 23 16:37:45.093863 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:45.093827 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71af0be6_1f33_49c7_ba45_d12899bb84e6.slice/crio-7688870c2756dda605653c1b831c829c68be6eaa22fb14f6984eef1fc4ddc70d WatchSource:0}: Error finding container 7688870c2756dda605653c1b831c829c68be6eaa22fb14f6984eef1fc4ddc70d: Status 404 returned error can't find the container with id 7688870c2756dda605653c1b831c829c68be6eaa22fb14f6984eef1fc4ddc70d Apr 23 16:37:46.035093 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:46.035050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" event={"ID":"71af0be6-1f33-49c7-ba45-d12899bb84e6","Type":"ContainerStarted","Data":"7688870c2756dda605653c1b831c829c68be6eaa22fb14f6984eef1fc4ddc70d"} Apr 23 16:37:47.044531 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:47.044488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" event={"ID":"71af0be6-1f33-49c7-ba45-d12899bb84e6","Type":"ContainerStarted","Data":"a0ec454bdd762b581a3d41ba7226ce4994d4d1be632da7a0bc284da2dc39f17c"} Apr 23 16:37:47.087645 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:47.087577 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7nckc" podStartSLOduration=33.288037384 podStartE2EDuration="35.087563288s" podCreationTimestamp="2026-04-23 16:37:12 +0000 UTC" firstStartedPulling="2026-04-23 16:37:45.096281735 +0000 UTC m=+157.118677842" lastFinishedPulling="2026-04-23 16:37:46.895807652 +0000 UTC m=+158.918203746" observedRunningTime="2026-04-23 16:37:47.087349118 +0000 UTC m=+159.109745234" watchObservedRunningTime="2026-04-23 16:37:47.087563288 +0000 UTC m=+159.109959404" Apr 23 16:37:49.330879 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.330839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn" Apr 23 16:37:49.331398 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.330916 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:37:49.333444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.333419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4db30a17-673a-4844-8750-e939b2e34518-metrics-tls\") pod \"dns-default-g2wqn\" (UID: \"4db30a17-673a-4844-8750-e939b2e34518\") " pod="openshift-dns/dns-default-g2wqn" Apr 23 16:37:49.334421 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.334396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/743aa8f5-75e8-4c04-8f4a-d49896428015-cert\") pod \"ingress-canary-rmrtl\" (UID: \"743aa8f5-75e8-4c04-8f4a-d49896428015\") " pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:37:49.535395 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.535351 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jmk6t\"" Apr 23 16:37:49.535395 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.535353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s982g\"" Apr 23 16:37:49.543249 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.543228 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g2wqn" Apr 23 16:37:49.543433 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.543371 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rmrtl" Apr 23 16:37:49.696995 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.696963 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g2wqn"] Apr 23 16:37:49.700341 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:49.700312 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db30a17_673a_4844_8750_e939b2e34518.slice/crio-6cc7aceae70a9ba326143a4b59ef01e58af8544080c160370bff3f51dab552ad WatchSource:0}: Error finding container 6cc7aceae70a9ba326143a4b59ef01e58af8544080c160370bff3f51dab552ad: Status 404 returned error can't find the container with id 6cc7aceae70a9ba326143a4b59ef01e58af8544080c160370bff3f51dab552ad Apr 23 16:37:49.716459 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:49.716429 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rmrtl"] Apr 23 16:37:49.718425 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:49.718400 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743aa8f5_75e8_4c04_8f4a_d49896428015.slice/crio-dc1447665aeee7ced2a2183d902dbf4ebdf7cf79f96543609977754839459e29 WatchSource:0}: Error finding container dc1447665aeee7ced2a2183d902dbf4ebdf7cf79f96543609977754839459e29: Status 404 returned error can't find the container with id dc1447665aeee7ced2a2183d902dbf4ebdf7cf79f96543609977754839459e29 Apr 23 16:37:50.053771 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:50.053691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rmrtl" event={"ID":"743aa8f5-75e8-4c04-8f4a-d49896428015","Type":"ContainerStarted","Data":"dc1447665aeee7ced2a2183d902dbf4ebdf7cf79f96543609977754839459e29"} Apr 23 16:37:50.054891 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:50.054867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g2wqn" event={"ID":"4db30a17-673a-4844-8750-e939b2e34518","Type":"ContainerStarted","Data":"6cc7aceae70a9ba326143a4b59ef01e58af8544080c160370bff3f51dab552ad"} Apr 23 16:37:51.182192 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:51.182154 2578 patch_prober.go:28] interesting pod/image-registry-67c5b6577b-5q6ph container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service 
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 16:37:51.182621 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:51.182211 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:37:52.063140 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:52.063096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rmrtl" event={"ID":"743aa8f5-75e8-4c04-8f4a-d49896428015","Type":"ContainerStarted","Data":"6febba700811d37d6b5350dc1f6a59d8d6f3c29de7f0718fbb4bd6b2d688434b"} Apr 23 16:37:52.066231 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:52.066205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g2wqn" event={"ID":"4db30a17-673a-4844-8750-e939b2e34518","Type":"ContainerStarted","Data":"68bd56d469ef03ab0d5ac446d1e9993cd806e6ed7fbcca148a84175280ac6641"} Apr 23 16:37:52.066344 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:52.066236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g2wqn" event={"ID":"4db30a17-673a-4844-8750-e939b2e34518","Type":"ContainerStarted","Data":"8d263407f6c3dd1ff30f57ad849f579faf5147df805b62d5106243ae4aa32a0d"} Apr 23 16:37:52.066544 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:52.066527 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g2wqn" Apr 23 16:37:52.083191 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:52.083143 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rmrtl" podStartSLOduration=129.024181426 podStartE2EDuration="2m11.083130063s" podCreationTimestamp="2026-04-23 16:35:41 +0000 UTC" firstStartedPulling="2026-04-23 16:37:49.720355561 +0000 UTC m=+161.742751666" lastFinishedPulling="2026-04-23 16:37:51.779304204 +0000 UTC m=+163.801700303" observedRunningTime="2026-04-23 16:37:52.082269962 +0000 UTC m=+164.104666080" watchObservedRunningTime="2026-04-23 16:37:52.083130063 +0000 UTC m=+164.105526230" Apr 23 16:37:52.102877 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:52.102822 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g2wqn" podStartSLOduration=129.029579739 podStartE2EDuration="2m11.102805047s" podCreationTimestamp="2026-04-23 16:35:41 +0000 UTC" firstStartedPulling="2026-04-23 16:37:49.702518986 +0000 UTC m=+161.724915086" lastFinishedPulling="2026-04-23 16:37:51.775744293 +0000 UTC m=+163.798140394" observedRunningTime="2026-04-23 16:37:52.101229275 +0000 UTC m=+164.123625391" watchObservedRunningTime="2026-04-23 16:37:52.102805047 +0000 UTC m=+164.125201165" Apr 23 16:37:53.364534 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.364496 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55b96bb664-958rm"] Apr 23 16:37:53.368054 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.368022 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.371186 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.371161 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 16:37:53.371367 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.371335 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 16:37:53.371921 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.371895 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 16:37:53.372040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.371935 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 16:37:53.372040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.371911 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-q82nr\"" Apr 23 16:37:53.372040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.372028 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 16:37:53.377395 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.377362 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b96bb664-958rm"] Apr 23 16:37:53.469018 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.468976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-oauth-config\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.469204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.469031 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-serving-cert\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.469204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.469057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqq6k\" (UniqueName: \"kubernetes.io/projected/19188d6b-5c97-42ae-92f3-cebd3e2636fb-kube-api-access-bqq6k\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.469204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.469123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-config\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.469204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.469177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-service-ca\") pod 
\"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.469204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.469203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-oauth-serving-cert\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570000 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.569967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-config\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-service-ca\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-oauth-serving-cert\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-oauth-config\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-serving-cert\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqq6k\" (UniqueName: \"kubernetes.io/projected/19188d6b-5c97-42ae-92f3-cebd3e2636fb-kube-api-access-bqq6k\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570689 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570665 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-service-ca\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570803 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570782 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-oauth-serving-cert\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.570858 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.570840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-config\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.572588 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.572568 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-oauth-config\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.572695 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.572573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-serving-cert\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.583570 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.583549 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqq6k\" (UniqueName: \"kubernetes.io/projected/19188d6b-5c97-42ae-92f3-cebd3e2636fb-kube-api-access-bqq6k\") pod \"console-55b96bb664-958rm\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.679298 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.679218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:37:53.803778 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:53.803739 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b96bb664-958rm"] Apr 23 16:37:53.808025 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:53.807993 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19188d6b_5c97_42ae_92f3_cebd3e2636fb.slice/crio-0e76f96ec06c58a36937c25465b25ae523fb3ccc6cf613ba773d0f4bead6f041 WatchSource:0}: Error finding container 0e76f96ec06c58a36937c25465b25ae523fb3ccc6cf613ba773d0f4bead6f041: Status 404 returned error can't find the container with id 0e76f96ec06c58a36937c25465b25ae523fb3ccc6cf613ba773d0f4bead6f041 Apr 23 16:37:54.071625 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:54.071586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b96bb664-958rm" event={"ID":"19188d6b-5c97-42ae-92f3-cebd3e2636fb","Type":"ContainerStarted","Data":"0e76f96ec06c58a36937c25465b25ae523fb3ccc6cf613ba773d0f4bead6f041"} Apr 23 16:37:55.065798 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.065751 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m"] Apr 23 16:37:55.069717 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.069690 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.073255 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.073232 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 16:37:55.073970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.073936 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-djf6z\"" Apr 23 16:37:55.073970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.073955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 16:37:55.074135 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.073990 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:37:55.084767 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.084739 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m"] Apr 23 16:37:55.087160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.087132 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-77j9v"] Apr 23 16:37:55.090790 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.090771 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.117746 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.117714 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 16:37:55.117746 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.117732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 16:37:55.117917 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.117794 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vplks\"" Apr 23 16:37:55.117917 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.117805 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 16:37:55.184743 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kww9r\" (UniqueName: \"kubernetes.io/projected/7c860e58-8c97-4fba-b206-c1a4c598ff18-kube-api-access-kww9r\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.184743 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-root\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.184949 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184775 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-sys\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.184949 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c860e58-8c97-4fba-b206-c1a4c598ff18-metrics-client-ca\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.184949 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-textfile\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.184949 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00590187-4b05-446e-b9d1-efc30e43aec4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.185130 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.184950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.185130 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.185010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00590187-4b05-446e-b9d1-efc30e43aec4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.185130 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.185051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-tls\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.185130 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.185099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-accelerators-collector-config\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.185130 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.185126 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlzb\" (UniqueName: \"kubernetes.io/projected/00590187-4b05-446e-b9d1-efc30e43aec4-kube-api-access-sjlzb\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.185446 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.185154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-wtmp\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.185446 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.185186 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/00590187-4b05-446e-b9d1-efc30e43aec4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.286468 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-sys\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286474 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c860e58-8c97-4fba-b206-c1a4c598ff18-metrics-client-ca\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-textfile\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00590187-4b05-446e-b9d1-efc30e43aec4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.286646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286592 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00590187-4b05-446e-b9d1-efc30e43aec4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.286646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-tls\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-accelerators-collector-config\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286709 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlzb\" (UniqueName: \"kubernetes.io/projected/00590187-4b05-446e-b9d1-efc30e43aec4-kube-api-access-sjlzb\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286737 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-wtmp\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/00590187-4b05-446e-b9d1-efc30e43aec4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kww9r\" (UniqueName: \"kubernetes.io/projected/7c860e58-8c97-4fba-b206-c1a4c598ff18-kube-api-access-kww9r\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-root\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-textfile\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-sys\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.286960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.286953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-root\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.287418 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.287125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c860e58-8c97-4fba-b206-c1a4c598ff18-metrics-client-ca\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.287514 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.287493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-accelerators-collector-config\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.287590 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.287567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00590187-4b05-446e-b9d1-efc30e43aec4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 16:37:55.287648 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.287631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-wtmp\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.291504 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.291445 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v" Apr 23 16:37:55.291621 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.291585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/00590187-4b05-446e-b9d1-efc30e43aec4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" Apr 23 
16:37:55.293238 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.293208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00590187-4b05-446e-b9d1-efc30e43aec4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m"
Apr 23 16:37:55.293541 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.293510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7c860e58-8c97-4fba-b206-c1a4c598ff18-node-exporter-tls\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v"
Apr 23 16:37:55.299652 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.299607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlzb\" (UniqueName: \"kubernetes.io/projected/00590187-4b05-446e-b9d1-efc30e43aec4-kube-api-access-sjlzb\") pod \"openshift-state-metrics-9d44df66c-gd78m\" (UID: \"00590187-4b05-446e-b9d1-efc30e43aec4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m"
Apr 23 16:37:55.300199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.300160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kww9r\" (UniqueName: \"kubernetes.io/projected/7c860e58-8c97-4fba-b206-c1a4c598ff18-kube-api-access-kww9r\") pod \"node-exporter-77j9v\" (UID: \"7c860e58-8c97-4fba-b206-c1a4c598ff18\") " pod="openshift-monitoring/node-exporter-77j9v"
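
The block above is the kubelet volume manager walking every volume for the two new pods through its three logged stages, always in this order: VerifyControllerAttachedVolume (reconciler_common.go:251), MountVolume started (reconciler_common.go:224), and MountVolume.SetUp succeeded (operation_generator.go:615). A minimal triage sketch, assuming the journal text is piped in unmodified and the message markers match what is printed here (the script name volume_trace.py is just an illustration), that groups the stages per volume and flags any volume that never reaches SetUp:

    import re, sys
    from collections import defaultdict

    # Stage markers exactly as they appear in the kubelet journal above.
    STAGES = [
        ("attach-verified", "VerifyControllerAttachedVolume started for volume"),
        ("mount-started", "operationExecutor.MountVolume started for volume"),
        ("setup-succeeded", "MountVolume.SetUp succeeded for volume"),
    ]
    # Tolerates klog's escaped quotes: for volume \"sys\" or for volume "sys".
    VOLUME = re.compile(r'for volume \\?"([^"\\]+)\\?"')

    # Volume name -> stages observed, in log order. Note volume names can
    # repeat across pods (e.g. metrics-client-ca); keying by name keeps the
    # sketch short.
    timeline = defaultdict(list)
    for line in sys.stdin:
        for stage, marker in STAGES:
            if marker in line and (m := VOLUME.search(line)):
                timeline[m.group(1)].append(stage)

    for volume, stages in sorted(timeline.items()):
        status = "ok" if "setup-succeeded" in stages else "STUCK"
        print(f"{status:5} {volume}: {' -> '.join(stages)}")

Fed this boot's journal (e.g. journalctl -u kubelet | python3 volume_trace.py), every volume for node-exporter-77j9v and openshift-state-metrics-9d44df66c-gd78m should report ok, matching the SetUp-succeeded entries above.
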
Apr 23 16:37:55.382527 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.382442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m"
Apr 23 16:37:55.402809 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.402779 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-77j9v"
Apr 23 16:37:55.418253 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:55.418216 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c860e58_8c97_4fba_b206_c1a4c598ff18.slice/crio-881f7ab219f500915aca789441999c0934eaaa1f0f409df8850bc76d59e093e6 WatchSource:0}: Error finding container 881f7ab219f500915aca789441999c0934eaaa1f0f409df8850bc76d59e093e6: Status 404 returned error can't find the container with id 881f7ab219f500915aca789441999c0934eaaa1f0f409df8850bc76d59e093e6
Apr 23 16:37:55.534463 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:55.534406 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m"]
Apr 23 16:37:55.539118 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:37:55.539070 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00590187_4b05_446e_b9d1_efc30e43aec4.slice/crio-b62a71972b54607803abda7b9a10d871a1df6c3f7c43f7c168567809e1652fd9 WatchSource:0}: Error finding container b62a71972b54607803abda7b9a10d871a1df6c3f7c43f7c168567809e1652fd9: Status 404 returned error can't find the container with id b62a71972b54607803abda7b9a10d871a1df6c3f7c43f7c168567809e1652fd9
Apr 23 16:37:56.086493 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.086456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" event={"ID":"00590187-4b05-446e-b9d1-efc30e43aec4","Type":"ContainerStarted","Data":"947f5230db4cde951cc2453b0e1d74b572d2f92d513966f73e02557f6cd28152"}
Apr 23 16:37:56.086493 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.086498 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" event={"ID":"00590187-4b05-446e-b9d1-efc30e43aec4","Type":"ContainerStarted","Data":"211a0f7678aec3f1921259ef190470210ae329a87c53d4055cc73d04f6e2e8b9"}
Apr 23 16:37:56.087015 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.086514 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" event={"ID":"00590187-4b05-446e-b9d1-efc30e43aec4","Type":"ContainerStarted","Data":"b62a71972b54607803abda7b9a10d871a1df6c3f7c43f7c168567809e1652fd9"}
Apr 23 16:37:56.087858 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.087825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-77j9v" event={"ID":"7c860e58-8c97-4fba-b206-c1a4c598ff18","Type":"ContainerStarted","Data":"881f7ab219f500915aca789441999c0934eaaa1f0f409df8850bc76d59e093e6"}
Apr 23 16:37:56.178028 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.177988 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:37:56.182268 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.182247 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.184591 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.184569 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 16:37:56.184698 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.184627 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 16:37:56.184981 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.184959 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 16:37:56.185096 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.184982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 16:37:56.185096 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.184982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h22r9\"" Apr 23 16:37:56.185510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.185323 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 16:37:56.185510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.185445 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 16:37:56.185510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.185449 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 16:37:56.185510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.185507 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 16:37:56.185789 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.185493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 16:37:56.197704 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.197681 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:37:56.294961 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.294917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295145 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.294975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-config-volume\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295145 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295145 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295298 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-web-config\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295298 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295191 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295298 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295243 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4j76\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-kube-api-access-f4j76\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295364 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-config-out\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295408 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295457 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.295681 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.295507 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396547 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396547 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-config-volume\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396547 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-web-config\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4j76\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-kube-api-access-f4j76\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396784 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-config-out\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.396824 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.397199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.397199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.396903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.398505 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.397713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.398505 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.398488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.398775 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.398569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.400796 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.399931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-config-out\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.401915 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.401863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.402510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.402188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.402510 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.402273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.402680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.402658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.402774 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.402749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.403663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.403639 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-web-config\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.403749 ip-10-0-129-102 
kubenswrapper[2578]: I0423 16:37:56.403710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-config-volume\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.404791 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.404766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.405395 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.405361 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4j76\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-kube-api-access-f4j76\") pod \"alertmanager-main-0\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:56.494555 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:56.494521 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:37:59.502328 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.502290 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-558c4f944f-hzqnx"] Apr 23 16:37:59.505829 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.505804 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.508025 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.507992 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 16:37:59.508144 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.507992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-26gw9\"" Apr 23 16:37:59.508144 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.508048 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 16:37:59.508801 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.508591 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:37:59.508801 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.508745 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1jesnsk39id0a\"" Apr 23 16:37:59.508801 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.508770 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 16:37:59.517174 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.517156 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-558c4f944f-hzqnx"] Apr 23 16:37:59.571358 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.571327 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:37:59.627472 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-secret-metrics-server-tls\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.627633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xl7\" (UniqueName: \"kubernetes.io/projected/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-kube-api-access-94xl7\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.627633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-metrics-server-audit-profiles\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.627633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.627633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627632 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-secret-metrics-server-client-certs\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.627868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627666 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-client-ca-bundle\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.627868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.627729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-audit-log\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728368 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-metrics-server-audit-profiles\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728561 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728561 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-secret-metrics-server-client-certs\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728561 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-client-ca-bundle\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728561 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-audit-log\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728783 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-secret-metrics-server-tls\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.728783 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.728657 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94xl7\" (UniqueName: \"kubernetes.io/projected/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-kube-api-access-94xl7\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.729143 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.729113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-audit-log\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.729269 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.729186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.729535 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.729485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-metrics-server-audit-profiles\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.731453 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.731430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-secret-metrics-server-client-certs\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.731568 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.731551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-client-ca-bundle\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.731633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.731613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-secret-metrics-server-tls\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.749810 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.749773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xl7\" (UniqueName: \"kubernetes.io/projected/ab3ca3b1-edf8-47a5-8f0d-357e4211c00f-kube-api-access-94xl7\") pod \"metrics-server-558c4f944f-hzqnx\" (UID: \"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f\") " pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.817828 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.817798 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:37:59.839481 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.839449 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg"] Apr 23 16:37:59.844155 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.844126 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" Apr 23 16:37:59.847943 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.847918 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-qjjxh\"" Apr 23 16:37:59.848102 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.848079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 16:37:59.857346 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.857280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg"] Apr 23 16:37:59.930956 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:37:59.930902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1a7f563a-e902-46a6-ba9c-dab961c3b378-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6btkg\" (UID: \"1a7f563a-e902-46a6-ba9c-dab961c3b378\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" Apr 23 16:38:00.032370 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:00.032283 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1a7f563a-e902-46a6-ba9c-dab961c3b378-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6btkg\" (UID: \"1a7f563a-e902-46a6-ba9c-dab961c3b378\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" Apr 23 16:38:00.035313 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:00.035286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1a7f563a-e902-46a6-ba9c-dab961c3b378-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-6btkg\" (UID: \"1a7f563a-e902-46a6-ba9c-dab961c3b378\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" Apr 23 16:38:00.155232 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:00.155140 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg"
Apr 23 16:38:01.181015 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:01.180974 2578 patch_prober.go:28] interesting pod/image-registry-67c5b6577b-5q6ph container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 16:38:01.181444 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:01.181035 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:38:01.184458 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:01.184429 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b96bb664-958rm"]
Apr 23 16:38:01.614724 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:01.614688 2578 patch_prober.go:28] interesting pod/image-registry-58544877c9-fd586 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 16:38:01.614887 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:01.614746 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-58544877c9-fd586" podUID="0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:38:02.070283 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:02.070259 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g2wqn"
Apr 23 16:38:03.026347 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:03.026314 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-58544877c9-fd586"
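
Both 503s above come from the registry's own health endpoint, and the prober records the response body it captured; the liveness-failing pod (image-registry-58544877c9-fd586) reports readiness two seconds later at 16:38:03, so this looks like a transient startup window rather than a persistent outage. Because the captured start-of-body is plain JSON, the failing check's detail can be pulled out mechanically; a small sketch, assuming the payload is well-formed exactly as printed above:

    import json, re

    line = ('Readiness probe status=failure output="HTTP probe failed with statuscode: 503" '
            'start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable",'
            '"detail":"health check failed: please see /debug/health"}]}')

    # Grab the JSON payload the prober captured and print each error's detail.
    if (m := re.search(r'start-of-body=(\{.*\})', line)):
        for err in json.loads(m.group(1)).get("errors", []):
            # -> UNAVAILABLE - health check failed: please see /debug/health
            print(err["code"], "-", err.get("detail") or err.get("message", ""))
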
Apr 23 16:38:04.062679 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:04.062649 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg"]
Apr 23 16:38:04.113119 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:38:04.113087 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7f563a_e902_46a6_ba9c_dab961c3b378.slice/crio-d0ab139166b3948a7f508d3bba8fe63b0903a117f3d30664e297152421ac793c WatchSource:0}: Error finding container d0ab139166b3948a7f508d3bba8fe63b0903a117f3d30664e297152421ac793c: Status 404 returned error can't find the container with id d0ab139166b3948a7f508d3bba8fe63b0903a117f3d30664e297152421ac793c
Apr 23 16:38:04.269891 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:04.269861 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-558c4f944f-hzqnx"]
Apr 23 16:38:04.272112 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:38:04.272069 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab3ca3b1_edf8_47a5_8f0d_357e4211c00f.slice/crio-77d5bab27da2ec5a041358349aeeef7687931432f4723777853adf5659e10fa2 WatchSource:0}: Error finding container 77d5bab27da2ec5a041358349aeeef7687931432f4723777853adf5659e10fa2: Status 404 returned error can't find the container with id 77d5bab27da2ec5a041358349aeeef7687931432f4723777853adf5659e10fa2
Apr 23 16:38:04.278588 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:04.278473 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:38:04.281189 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:38:04.281112 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31edc170_24b8_484a_9f30_9c9cc72cd719.slice/crio-54eb42b3f3e62d47178f50d6ac39b9a15125d6e32ed8caf1aac68a639e123039 WatchSource:0}: Error finding container 54eb42b3f3e62d47178f50d6ac39b9a15125d6e32ed8caf1aac68a639e123039: Status 404 returned error can't find the container with id 54eb42b3f3e62d47178f50d6ac39b9a15125d6e32ed8caf1aac68a639e123039
Apr 23 16:38:05.121543 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.121486 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" event={"ID":"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f","Type":"ContainerStarted","Data":"77d5bab27da2ec5a041358349aeeef7687931432f4723777853adf5659e10fa2"}
Apr 23 16:38:05.124270 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.124237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-csnvz" event={"ID":"99a16fcc-e537-4736-a7fd-4a673684aa6e","Type":"ContainerStarted","Data":"2483dfae3216fedb61e286eedd05a1461c8d800dd934e4a96c0330f5d3b97383"}
Apr 23 16:38:05.124604 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.124577 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-csnvz"
Apr 23 16:38:05.126773 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.126724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" event={"ID":"1a7f563a-e902-46a6-ba9c-dab961c3b378","Type":"ContainerStarted","Data":"d0ab139166b3948a7f508d3bba8fe63b0903a117f3d30664e297152421ac793c"}
Apr 23 16:38:05.129500 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.129139 2578 generic.go:358] "Generic (PLEG): container finished" podID="7c860e58-8c97-4fba-b206-c1a4c598ff18" containerID="df5f53a0002d4e82e4bd042e63411d15b6b6e94b2d7fcedff7a7a3b1717d7841" exitCode=0
Apr 23 16:38:05.129500 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.129226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-77j9v" event={"ID":"7c860e58-8c97-4fba-b206-c1a4c598ff18","Type":"ContainerDied","Data":"df5f53a0002d4e82e4bd042e63411d15b6b6e94b2d7fcedff7a7a3b1717d7841"}
Apr 23 16:38:05.131962 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.131917 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"54eb42b3f3e62d47178f50d6ac39b9a15125d6e32ed8caf1aac68a639e123039"}
Apr 23 16:38:05.136321 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.136294 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" event={"ID":"00590187-4b05-446e-b9d1-efc30e43aec4","Type":"ContainerStarted","Data":"bbfbe13d536eb8d584ae9ff6dffe4cad95f8340f64506a44bf69944c5d30cd96"}
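
The W-level "Failed to process watch event ... Status 404" entries in this boot appear to be the common benign race in which cAdvisor notices a new crio-<id> cgroup before CRI-O has registered the container. That reading fits here: every container ID that 404s (881f7ab2..., b62a7197..., d0ab1391..., 77d5bab2..., 54eb42b3...) shows up moments later in a PLEG ContainerStarted event. A sketch, assuming the journal text matches the patterns printed above, that flags only the 404s which never resolve:

    import re, sys

    # Patterns as they appear in the kubelet journal above.
    WATCH_404 = re.compile(r"can't find the container with id ([0-9a-f]{64})")
    STARTED = re.compile(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"')

    pending, started = {}, set()
    for lineno, line in enumerate(sys.stdin, 1):
        if (m := WATCH_404.search(line)):
            pending.setdefault(m.group(1), lineno)   # first 404 for this container
        if (m := STARTED.search(line)):
            started.add(m.group(1))

    for cid, lineno in pending.items():
        if cid not in started:
            print(f"line {lineno}: watch 404 for {cid[:12]}... never followed by ContainerStarted")

On this journal the script should print nothing, which is the evidence that these warnings were transient.
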
Apr 23 16:38:05.139348 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.139321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b96bb664-958rm" event={"ID":"19188d6b-5c97-42ae-92f3-cebd3e2636fb","Type":"ContainerStarted","Data":"d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1"}
Apr 23 16:38:05.140743 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.140720 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-csnvz"
Apr 23 16:38:05.145154 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.143271 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-csnvz" podStartSLOduration=2.879764068 podStartE2EDuration="24.143257343s" podCreationTimestamp="2026-04-23 16:37:41 +0000 UTC" firstStartedPulling="2026-04-23 16:37:42.852571049 +0000 UTC m=+154.874967143" lastFinishedPulling="2026-04-23 16:38:04.116064316 +0000 UTC m=+176.138460418" observedRunningTime="2026-04-23 16:38:05.142127679 +0000 UTC m=+177.164523795" watchObservedRunningTime="2026-04-23 16:38:05.143257343 +0000 UTC m=+177.165653456"
Apr 23 16:38:05.189112 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.189062 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gd78m" podStartSLOduration=1.7704619560000001 podStartE2EDuration="10.189043432s" podCreationTimestamp="2026-04-23 16:37:55 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.70437298 +0000 UTC m=+167.726769082" lastFinishedPulling="2026-04-23 16:38:04.122954451 +0000 UTC m=+176.145350558" observedRunningTime="2026-04-23 16:38:05.188442361 +0000 UTC m=+177.210838477" watchObservedRunningTime="2026-04-23 16:38:05.189043432 +0000 UTC m=+177.211439549"
Apr 23 16:38:05.189922 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.189766 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55b96bb664-958rm" podStartSLOduration=2.130613646 podStartE2EDuration="12.189754074s" podCreationTimestamp="2026-04-23 16:37:53 +0000 UTC" firstStartedPulling="2026-04-23 16:37:53.809920816 +0000 UTC m=+165.832316910" lastFinishedPulling="2026-04-23 16:38:03.869061234 +0000 UTC m=+175.891457338" observedRunningTime="2026-04-23 16:38:05.161133649 +0000 UTC m=+177.183529766" watchObservedRunningTime="2026-04-23 16:38:05.189754074 +0000 UTC m=+177.212150191"
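
The pod_startup_latency_tracker entries above report two durations per pod: podStartE2EDuration (creation to observed running) and podStartSLOduration, which, at least for these entries, works out to the E2E duration minus the image-pull window (lastFinishedPulling - firstStartedPulling). The downloads pod's numbers check out to the nanosecond under that reading; a quick verification using the monotonic "m=+..." offsets printed in the entry:

    # Monotonic clock offsets (the "m=+..." values) from the
    # downloads-6bcc868b7-csnvz entry above, in seconds.
    first_started_pulling = 154.874967143
    last_finished_pulling = 176.138460418

    e2e_duration = 24.143257343          # podStartE2EDuration as logged
    pull_window = last_finished_pulling - first_started_pulling
    slo_duration = e2e_duration - pull_window
    print(f"{slo_duration:.9f}")         # -> 2.879764068, the logged podStartSLOduration

In other words, of the 24.1 s this pod took end to end, about 21.3 s was spent pulling its image, which matches the late lastFinishedPulling timestamps seen across all three pods here.
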
Apr 23 16:38:05.698885 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.698855 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c76c6945d-7thrk"]
Apr 23 16:38:05.706560 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.706524 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 16:38:05.795633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.795606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-oauth-config\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.795825 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.795642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-console-config\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.795825 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.795696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss2s\" (UniqueName: \"kubernetes.io/projected/8e52370b-5f13-4378-b500-d5e555a57e65-kube-api-access-fss2s\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.795942 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.795908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-service-ca\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.795999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.795962 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-trusted-ca-bundle\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.796097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.796070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-oauth-serving-cert\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.796160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.796117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-serving-cert\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.896962 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.896809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-oauth-serving-cert\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.896962 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.896866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-serving-cert\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.896962 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.896915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-oauth-config\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.897252 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.897112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-console-config\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.897252 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.897194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fss2s\" (UniqueName: \"kubernetes.io/projected/8e52370b-5f13-4378-b500-d5e555a57e65-kube-api-access-fss2s\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.897252 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.897244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-service-ca\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.897412 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.897269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-trusted-ca-bundle\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.898027 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.897966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-oauth-serving-cert\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.898793 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.898768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-service-ca\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.898892 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.898845 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-console-config\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.899267 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.899235 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-trusted-ca-bundle\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.901577 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.901534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-oauth-config\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.903349 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.903303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-serving-cert\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:05.905655 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:05.905636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss2s\" (UniqueName: \"kubernetes.io/projected/8e52370b-5f13-4378-b500-d5e555a57e65-kube-api-access-fss2s\") pod \"console-6c76c6945d-7thrk\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:06.013067 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:06.013015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:06.144213 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:06.144176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-77j9v" event={"ID":"7c860e58-8c97-4fba-b206-c1a4c598ff18","Type":"ContainerStarted","Data":"58b64540819b76864e9ce6b2b0486d2ac69b3c3cadd7b3387ba39d636c4e2ef4"}
Apr 23 16:38:06.194589 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:06.194546 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerName="registry" containerID="cri-o://f9c9787c2acd1a88af15b920115dcaab39e9abf9af00750fa355eb8287e26f3c" gracePeriod=30
Apr 23 16:38:07.148896 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.148820 2578 generic.go:358] "Generic (PLEG): container finished" podID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerID="f9c9787c2acd1a88af15b920115dcaab39e9abf9af00750fa355eb8287e26f3c" exitCode=0
Apr 23 16:38:07.149284 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.148909 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" event={"ID":"1f1ab387-785c-479f-b2b2-27f092332c1b","Type":"ContainerDied","Data":"f9c9787c2acd1a88af15b920115dcaab39e9abf9af00750fa355eb8287e26f3c"}
Apr 23 16:38:07.425425 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.424630 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:38:07.512781 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.512533 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c76c6945d-7thrk"]
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.518867 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-bound-sa-token\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.518941 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-trusted-ca\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519004 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-image-registry-private-configuration\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519036 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54qlk\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-kube-api-access-54qlk\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519067 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519110 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-certificates\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519129 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-installation-pull-secrets\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.519630 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519155 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f1ab387-785c-479f-b2b2-27f092332c1b-ca-trust-extracted\") pod \"1f1ab387-785c-479f-b2b2-27f092332c1b\" (UID: \"1f1ab387-785c-479f-b2b2-27f092332c1b\") "
Apr 23 16:38:07.520137 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.519970 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:38:07.521782 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.520997 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:38:07.526495 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.526454 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:38:07.529649 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.529584 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:38:07.529649 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.529605 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:38:07.529943 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.529907 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-kube-api-access-54qlk" (OuterVolumeSpecName: "kube-api-access-54qlk") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "kube-api-access-54qlk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:38:07.530087 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.530059 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:38:07.535463 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.535402 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f1ab387-785c-479f-b2b2-27f092332c1b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1f1ab387-785c-479f-b2b2-27f092332c1b" (UID: "1f1ab387-785c-479f-b2b2-27f092332c1b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:38:07.620675 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620647 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-trusted-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620675 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620677 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-image-registry-private-configuration\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620693 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54qlk\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-kube-api-access-54qlk\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620707 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620720 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1f1ab387-785c-479f-b2b2-27f092332c1b-registry-certificates\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620735 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1f1ab387-785c-479f-b2b2-27f092332c1b-installation-pull-secrets\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620748 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1f1ab387-785c-479f-b2b2-27f092332c1b-ca-trust-extracted\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:07.620860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:07.620760 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1f1ab387-785c-479f-b2b2-27f092332c1b-bound-sa-token\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:38:08.153426 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.153362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph" event={"ID":"1f1ab387-785c-479f-b2b2-27f092332c1b","Type":"ContainerDied","Data":"bebc79c6e41d408a9b1f2fdd8ddab1d4fecef7da45f06c86de93f58356069cad"}
Apr 23 16:38:08.153426 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.153411 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67c5b6577b-5q6ph"
Apr 23 16:38:08.153909 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.153473 2578 scope.go:117] "RemoveContainer" containerID="f9c9787c2acd1a88af15b920115dcaab39e9abf9af00750fa355eb8287e26f3c"
Apr 23 16:38:08.154993 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.154962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" event={"ID":"ab3ca3b1-edf8-47a5-8f0d-357e4211c00f","Type":"ContainerStarted","Data":"041d1f93cc29018a616bb8f926bb5c041d897f4260f67576cd719609fd5445ff"}
Apr 23 16:38:08.156637 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.156613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c76c6945d-7thrk" event={"ID":"8e52370b-5f13-4378-b500-d5e555a57e65","Type":"ContainerStarted","Data":"c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5"}
Apr 23 16:38:08.156736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.156645 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c76c6945d-7thrk" event={"ID":"8e52370b-5f13-4378-b500-d5e555a57e65","Type":"ContainerStarted","Data":"a5cdeb4f878855a8ad9adb4fd8f0748e28a396cb1cb0643e49bcbb2b921e8950"}
Apr 23 16:38:08.158170 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.158140 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" event={"ID":"1a7f563a-e902-46a6-ba9c-dab961c3b378","Type":"ContainerStarted","Data":"608ad2184cd44d63f0ef9bc4f8937f556650f22a2aeabcc3737172dc07a8785e"}
Apr 23 16:38:08.158335 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.158318 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg"
Apr 23 16:38:08.160274 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.160247 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-77j9v" event={"ID":"7c860e58-8c97-4fba-b206-c1a4c598ff18","Type":"ContainerStarted","Data":"4468a03f6ffafaf4a12658faa74c4ecba6a8e120e4add81e9a98b49ca14937a1"}
Apr 23 16:38:08.161891 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.161863 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f" exitCode=0
Apr 23 16:38:08.161976 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.161915 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f"}
Apr 23 16:38:08.164469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.164449 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg"
Apr 23 16:38:08.177412 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.177195 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c76c6945d-7thrk" podStartSLOduration=3.177181604 podStartE2EDuration="3.177181604s" podCreationTimestamp="2026-04-23 16:38:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:38:08.177067204 +0000 UTC m=+180.199463331" watchObservedRunningTime="2026-04-23 16:38:08.177181604 +0000 UTC m=+180.199577721"
Apr 23 16:38:08.198808 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.198751 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" podStartSLOduration=6.140079223 podStartE2EDuration="9.198736035s" podCreationTimestamp="2026-04-23 16:37:59 +0000 UTC" firstStartedPulling="2026-04-23 16:38:04.274535452 +0000 UTC m=+176.296931546" lastFinishedPulling="2026-04-23 16:38:07.33319225 +0000 UTC m=+179.355588358" observedRunningTime="2026-04-23 16:38:08.196849402 +0000 UTC m=+180.219245745" watchObservedRunningTime="2026-04-23 16:38:08.198736035 +0000 UTC m=+180.221132152"
Apr 23 16:38:08.218815 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.218759 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-77j9v" podStartSLOduration=4.4713530089999995 podStartE2EDuration="13.218745026s" podCreationTimestamp="2026-04-23 16:37:55 +0000 UTC" firstStartedPulling="2026-04-23 16:37:55.421688089 +0000 UTC m=+167.444084182" lastFinishedPulling="2026-04-23 16:38:04.169080088 +0000 UTC m=+176.191476199" observedRunningTime="2026-04-23 16:38:08.217130953 +0000 UTC m=+180.239527093" watchObservedRunningTime="2026-04-23 16:38:08.218745026 +0000 UTC m=+180.241141143"
Apr 23 16:38:08.260612 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.260576 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67c5b6577b-5q6ph"]
Apr 23 16:38:08.264244 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.264214 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67c5b6577b-5q6ph"]
Apr 23 16:38:08.281859 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.281803 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-6btkg" podStartSLOduration=6.058814943 podStartE2EDuration="9.281788167s" podCreationTimestamp="2026-04-23 16:37:59 +0000 UTC" firstStartedPulling="2026-04-23 16:38:04.115627469 +0000 UTC m=+176.138023577" lastFinishedPulling="2026-04-23 16:38:07.338600706 +0000 UTC m=+179.360996801" observedRunningTime="2026-04-23 16:38:08.28028168 +0000 UTC m=+180.302677795" watchObservedRunningTime="2026-04-23 16:38:08.281788167 +0000 UTC m=+180.304184285"
Apr 23 16:38:08.577997 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:08.577966 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" path="/var/lib/kubelet/pods/1f1ab387-785c-479f-b2b2-27f092332c1b/volumes"
Apr 23 16:38:11.175524 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:11.175490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c"}
Apr 23 16:38:11.175886 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:11.175532 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d"}
Apr 23 16:38:11.175886 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:11.175547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7"}
Apr 23 16:38:11.175886 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:11.175562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43"}
Apr 23 16:38:12.182121 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:12.182085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b"}
Apr 23 16:38:13.188851 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:13.188815 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerStarted","Data":"5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808"}
Apr 23 16:38:13.218684 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:13.218618 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=8.840160383 podStartE2EDuration="17.2185978s" podCreationTimestamp="2026-04-23 16:37:56 +0000 UTC" firstStartedPulling="2026-04-23 16:38:04.283640276 +0000 UTC m=+176.306036371" lastFinishedPulling="2026-04-23 16:38:12.662077686 +0000 UTC m=+184.684473788" observedRunningTime="2026-04-23 16:38:13.216979081 +0000 UTC m=+185.239375211" watchObservedRunningTime="2026-04-23 16:38:13.2185978 +0000 UTC m=+185.240993916"
Apr 23 16:38:13.679879 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:13.679840 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55b96bb664-958rm"
Apr 23 16:38:16.013228 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:16.013191 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:16.013744 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:16.013262 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:16.018088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:16.018066 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:16.202407 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:16.202366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c76c6945d-7thrk"
Apr 23 16:38:19.818012 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:19.817976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx"
Apr 23 16:38:19.818399 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:19.818055 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx"
Apr 23 16:38:30.168071 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.168006 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55b96bb664-958rm" podUID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" containerName="console" containerID="cri-o://d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1" gracePeriod=15
podUID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" containerName="console" containerID="cri-o://d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1" gracePeriod=15 Apr 23 16:38:30.405551 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.405527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b96bb664-958rm_19188d6b-5c97-42ae-92f3-cebd3e2636fb/console/0.log" Apr 23 16:38:30.405674 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.405589 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:38:30.525712 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.525615 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-config\") pod \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " Apr 23 16:38:30.525712 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.525680 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-oauth-serving-cert\") pod \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " Apr 23 16:38:30.525944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.525793 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-oauth-config\") pod \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " Apr 23 16:38:30.525944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.525876 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqq6k\" (UniqueName: \"kubernetes.io/projected/19188d6b-5c97-42ae-92f3-cebd3e2636fb-kube-api-access-bqq6k\") pod \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " Apr 23 16:38:30.525944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.525927 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-serving-cert\") pod \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " Apr 23 16:38:30.526082 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.525955 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-service-ca\") pod \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\" (UID: \"19188d6b-5c97-42ae-92f3-cebd3e2636fb\") " Apr 23 16:38:30.526082 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.526025 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-config" (OuterVolumeSpecName: "console-config") pod "19188d6b-5c97-42ae-92f3-cebd3e2636fb" (UID: "19188d6b-5c97-42ae-92f3-cebd3e2636fb"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:30.526178 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.526089 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "19188d6b-5c97-42ae-92f3-cebd3e2636fb" (UID: "19188d6b-5c97-42ae-92f3-cebd3e2636fb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:30.526255 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.526235 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:30.526310 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.526261 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-oauth-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:30.526423 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.526395 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-service-ca" (OuterVolumeSpecName: "service-ca") pod "19188d6b-5c97-42ae-92f3-cebd3e2636fb" (UID: "19188d6b-5c97-42ae-92f3-cebd3e2636fb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:38:30.528213 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.528180 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "19188d6b-5c97-42ae-92f3-cebd3e2636fb" (UID: "19188d6b-5c97-42ae-92f3-cebd3e2636fb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:30.528213 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.528201 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "19188d6b-5c97-42ae-92f3-cebd3e2636fb" (UID: "19188d6b-5c97-42ae-92f3-cebd3e2636fb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:38:30.528335 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.528267 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19188d6b-5c97-42ae-92f3-cebd3e2636fb-kube-api-access-bqq6k" (OuterVolumeSpecName: "kube-api-access-bqq6k") pod "19188d6b-5c97-42ae-92f3-cebd3e2636fb" (UID: "19188d6b-5c97-42ae-92f3-cebd3e2636fb"). InnerVolumeSpecName "kube-api-access-bqq6k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:38:30.627496 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.627464 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqq6k\" (UniqueName: \"kubernetes.io/projected/19188d6b-5c97-42ae-92f3-cebd3e2636fb-kube-api-access-bqq6k\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:30.627496 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.627491 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:30.627496 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.627501 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19188d6b-5c97-42ae-92f3-cebd3e2636fb-service-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:30.627694 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:30.627509 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19188d6b-5c97-42ae-92f3-cebd3e2636fb-console-oauth-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:38:31.242274 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.242245 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b96bb664-958rm_19188d6b-5c97-42ae-92f3-cebd3e2636fb/console/0.log" Apr 23 16:38:31.242691 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.242287 2578 generic.go:358] "Generic (PLEG): container finished" podID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" containerID="d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1" exitCode=2 Apr 23 16:38:31.242691 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.242321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b96bb664-958rm" event={"ID":"19188d6b-5c97-42ae-92f3-cebd3e2636fb","Type":"ContainerDied","Data":"d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1"} Apr 23 16:38:31.242691 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.242359 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b96bb664-958rm" Apr 23 16:38:31.242691 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.242389 2578 scope.go:117] "RemoveContainer" containerID="d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1" Apr 23 16:38:31.242691 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.242364 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b96bb664-958rm" event={"ID":"19188d6b-5c97-42ae-92f3-cebd3e2636fb","Type":"ContainerDied","Data":"0e76f96ec06c58a36937c25465b25ae523fb3ccc6cf613ba773d0f4bead6f041"} Apr 23 16:38:31.250322 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.250305 2578 scope.go:117] "RemoveContainer" containerID="d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1" Apr 23 16:38:31.250621 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:38:31.250592 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1\": container with ID starting with d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1 not found: ID does not exist" containerID="d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1" Apr 23 16:38:31.250724 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.250627 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1"} err="failed to get container status \"d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1\": rpc error: code = NotFound desc = could not find container \"d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1\": container with ID starting with d631fa529d845a9dd118ae169be9ca5a927ffa594b7b16e14b9f5ef061a171d1 not found: ID does not exist" Apr 23 16:38:31.262639 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.262615 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b96bb664-958rm"] Apr 23 16:38:31.266806 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:31.266781 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55b96bb664-958rm"] Apr 23 16:38:32.575560 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:32.575529 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" path="/var/lib/kubelet/pods/19188d6b-5c97-42ae-92f3-cebd3e2636fb/volumes" Apr 23 16:38:37.267170 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:37.267130 2578 generic.go:358] "Generic (PLEG): container finished" podID="e4eef891-cf79-4965-b0db-94974d87932b" containerID="768235473bf6e11d44e3c5a4c0986c0cd03d451e2d4cefdebd50e0f5e4b16b9d" exitCode=0 Apr 23 16:38:37.267558 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:37.267188 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" event={"ID":"e4eef891-cf79-4965-b0db-94974d87932b","Type":"ContainerDied","Data":"768235473bf6e11d44e3c5a4c0986c0cd03d451e2d4cefdebd50e0f5e4b16b9d"} Apr 23 16:38:37.267558 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:37.267551 2578 scope.go:117] "RemoveContainer" containerID="768235473bf6e11d44e3c5a4c0986c0cd03d451e2d4cefdebd50e0f5e4b16b9d" Apr 23 16:38:38.271523 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:38.271489 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
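The E0423 "ContainerStatus from runtime service failed" / "DeleteContainer returned error" pair above is benign: after the second "RemoveContainer", the kubelet asks the runtime for the container's status, CRI-O has already deleted it, and the NotFound answer simply confirms the container is gone. The general shape is idempotent deletion, along these lines (a sketch with a hypothetical runtime client, not kubelet source):

    class NotFound(Exception):
        """Stand-in for the runtime's gRPC NotFound status."""

    def remove_container(runtime, container_id):
        # Deletion is idempotent: NotFound means the desired state
        # (container gone) already holds, so it is not treated as a failure.
        try:
            runtime.remove_container(container_id)  # hypothetical client call
        except NotFound:
            pass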
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r2qw8" event={"ID":"e4eef891-cf79-4965-b0db-94974d87932b","Type":"ContainerStarted","Data":"a555ae2ba8207ae35567643b5dd24364e076a7f2e76b055734cf276802af253b"} Apr 23 16:38:39.823500 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:39.823470 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:38:39.827203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:39.827182 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-558c4f944f-hzqnx" Apr 23 16:38:41.281390 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:41.281358 2578 generic.go:358] "Generic (PLEG): container finished" podID="6b7ec9ae-872e-40fc-8d51-650ccb39c97b" containerID="4658ca6d951865bef26ee18d25e5eea2da97882955b6311749cc044e3b541009" exitCode=0 Apr 23 16:38:41.281743 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:41.281431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" event={"ID":"6b7ec9ae-872e-40fc-8d51-650ccb39c97b","Type":"ContainerDied","Data":"4658ca6d951865bef26ee18d25e5eea2da97882955b6311749cc044e3b541009"} Apr 23 16:38:41.281743 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:41.281726 2578 scope.go:117] "RemoveContainer" containerID="4658ca6d951865bef26ee18d25e5eea2da97882955b6311749cc044e3b541009" Apr 23 16:38:42.286214 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:38:42.286176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-94hpw" event={"ID":"6b7ec9ae-872e-40fc-8d51-650ccb39c97b","Type":"ContainerStarted","Data":"a70e6b4d0a61873adde8e9cf454a6ebc5dcc56b6d85ed5ba572b5774c2b4d216"} Apr 23 16:39:15.532852 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.532818 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:15.533267 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.533224 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="alertmanager" containerID="cri-o://8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43" gracePeriod=120 Apr 23 16:39:15.533454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.533320 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-web" containerID="cri-o://8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d" gracePeriod=120 Apr 23 16:39:15.533454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.533352 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy" containerID="cri-o://bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c" gracePeriod=120 Apr 23 16:39:15.533454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.533365 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="prom-label-proxy" 
containerID="cri-o://5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808" gracePeriod=120 Apr 23 16:39:15.533646 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.533397 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="config-reloader" containerID="cri-o://723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7" gracePeriod=120 Apr 23 16:39:15.534154 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:15.533306 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-metric" containerID="cri-o://8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b" gracePeriod=120 Apr 23 16:39:16.396699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396665 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808" exitCode=0 Apr 23 16:39:16.396699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396692 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b" exitCode=0 Apr 23 16:39:16.396699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396700 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c" exitCode=0 Apr 23 16:39:16.396699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396706 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7" exitCode=0 Apr 23 16:39:16.396699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396712 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43" exitCode=0 Apr 23 16:39:16.396990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808"} Apr 23 16:39:16.396990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396799 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b"} Apr 23 16:39:16.396990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396810 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c"} Apr 23 16:39:16.396990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7"} Apr 23 16:39:16.396990 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.396829 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43"} Apr 23 16:39:16.759938 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.759916 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:16.808219 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808186 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-tls-assets\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808229 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808263 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-trusted-ca-bundle\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808292 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-main-db\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808335 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-metric\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808365 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-metrics-client-ca\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808406 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4j76\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-kube-api-access-f4j76\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808431 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-config-out\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808462 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-web\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808508 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-cluster-tls-config\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808534 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-web-config\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808567 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-config-volume\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.808663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808603 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-main-tls\") pod \"31edc170-24b8-484a-9f30-9c9cc72cd719\" (UID: \"31edc170-24b8-484a-9f30-9c9cc72cd719\") " Apr 23 16:39:16.809001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808681 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:39:16.809001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808707 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:16.809001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808921 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.809001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.808945 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-alertmanager-main-db\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.809204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.809147 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:39:16.812283 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.812253 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.812589 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.812561 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.812721 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.812661 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:39:16.812721 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.812710 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-config-out" (OuterVolumeSpecName: "config-out") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:39:16.812823 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.812756 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-kube-api-access-f4j76" (OuterVolumeSpecName: "kube-api-access-f4j76") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). 
InnerVolumeSpecName "kube-api-access-f4j76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:39:16.812823 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.812799 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.813640 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.813605 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.814146 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.814121 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-config-volume" (OuterVolumeSpecName: "config-volume") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.816980 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.816962 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.823518 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.823484 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-web-config" (OuterVolumeSpecName: "web-config") pod "31edc170-24b8-484a-9f30-9c9cc72cd719" (UID: "31edc170-24b8-484a-9f30-9c9cc72cd719"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:39:16.909764 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909733 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-tls-assets\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909764 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909759 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909764 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909769 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909780 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31edc170-24b8-484a-9f30-9c9cc72cd719-metrics-client-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909790 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4j76\" (UniqueName: \"kubernetes.io/projected/31edc170-24b8-484a-9f30-9c9cc72cd719-kube-api-access-f4j76\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909800 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31edc170-24b8-484a-9f30-9c9cc72cd719-config-out\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909810 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909819 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-cluster-tls-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909828 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-web-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909837 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-config-volume\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:16.909970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:16.909844 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/31edc170-24b8-484a-9f30-9c9cc72cd719-secret-alertmanager-main-tls\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:39:17.402348 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.402308 2578 generic.go:358] "Generic (PLEG): container finished" podID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerID="8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d" exitCode=0 Apr 23 16:39:17.402503 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.402404 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d"} Apr 23 16:39:17.402503 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.402435 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.402503 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.402448 2578 scope.go:117] "RemoveContainer" containerID="5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808" Apr 23 16:39:17.402663 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.402438 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31edc170-24b8-484a-9f30-9c9cc72cd719","Type":"ContainerDied","Data":"54eb42b3f3e62d47178f50d6ac39b9a15125d6e32ed8caf1aac68a639e123039"} Apr 23 16:39:17.411230 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.411158 2578 scope.go:117] "RemoveContainer" containerID="8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b" Apr 23 16:39:17.420584 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.420564 2578 scope.go:117] "RemoveContainer" containerID="bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c" Apr 23 16:39:17.427022 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.427005 2578 scope.go:117] "RemoveContainer" containerID="8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d" Apr 23 16:39:17.430845 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.430819 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:17.433801 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.433770 2578 scope.go:117] "RemoveContainer" containerID="723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7" Apr 23 16:39:17.440709 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.440691 2578 scope.go:117] "RemoveContainer" containerID="8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43" Apr 23 16:39:17.441162 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.441144 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:17.446925 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.446910 2578 scope.go:117] "RemoveContainer" containerID="ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f" Apr 23 16:39:17.453230 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453213 2578 scope.go:117] "RemoveContainer" containerID="5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808" Apr 23 16:39:17.453512 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.453494 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808\": container with ID 
starting with 5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808 not found: ID does not exist" containerID="5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808" Apr 23 16:39:17.453586 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453518 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808"} err="failed to get container status \"5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808\": rpc error: code = NotFound desc = could not find container \"5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808\": container with ID starting with 5c23002ab498ee4dd928acbe131c663caa21dc88252067b95b128d3121285808 not found: ID does not exist" Apr 23 16:39:17.453586 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453538 2578 scope.go:117] "RemoveContainer" containerID="8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b" Apr 23 16:39:17.453745 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.453730 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b\": container with ID starting with 8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b not found: ID does not exist" containerID="8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b" Apr 23 16:39:17.453789 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453748 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b"} err="failed to get container status \"8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b\": rpc error: code = NotFound desc = could not find container \"8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b\": container with ID starting with 8a918b6e8badf6cc0241f8ffcd43df40f687a20e0e724edb9897b6f7831ed48b not found: ID does not exist" Apr 23 16:39:17.453789 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453759 2578 scope.go:117] "RemoveContainer" containerID="bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c" Apr 23 16:39:17.453933 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.453919 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c\": container with ID starting with bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c not found: ID does not exist" containerID="bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c" Apr 23 16:39:17.453975 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453934 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c"} err="failed to get container status \"bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c\": rpc error: code = NotFound desc = could not find container \"bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c\": container with ID starting with bbda637009580acf9c3c3d3d1fd003627fc07f7b7673c818d483277ed497ec2c not found: ID does not exist" Apr 23 16:39:17.453975 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.453945 2578 scope.go:117] "RemoveContainer" 
containerID="8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d" Apr 23 16:39:17.454164 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.454147 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d\": container with ID starting with 8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d not found: ID does not exist" containerID="8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d" Apr 23 16:39:17.454208 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454170 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d"} err="failed to get container status \"8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d\": rpc error: code = NotFound desc = could not find container \"8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d\": container with ID starting with 8ae1728f1c79e9449d50255555e1d5a17d84507a54335ece02732b06b6d8680d not found: ID does not exist" Apr 23 16:39:17.454208 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454186 2578 scope.go:117] "RemoveContainer" containerID="723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7" Apr 23 16:39:17.454420 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.454400 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7\": container with ID starting with 723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7 not found: ID does not exist" containerID="723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7" Apr 23 16:39:17.454464 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454429 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7"} err="failed to get container status \"723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7\": rpc error: code = NotFound desc = could not find container \"723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7\": container with ID starting with 723ae54146747a495bed6487ade7e83cefdc75493edbca6d8efc99dbe936ebc7 not found: ID does not exist" Apr 23 16:39:17.454464 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454445 2578 scope.go:117] "RemoveContainer" containerID="8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43" Apr 23 16:39:17.454658 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.454639 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43\": container with ID starting with 8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43 not found: ID does not exist" containerID="8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43" Apr 23 16:39:17.454699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454665 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43"} err="failed to get container status \"8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43\": rpc error: code = NotFound desc = could not 
find container \"8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43\": container with ID starting with 8ab0cfa49c48b59026ba2d521ebb91e151938a97b2f243d701978f74f81b4d43 not found: ID does not exist" Apr 23 16:39:17.454699 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454683 2578 scope.go:117] "RemoveContainer" containerID="ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f" Apr 23 16:39:17.454872 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:39:17.454858 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f\": container with ID starting with ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f not found: ID does not exist" containerID="ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f" Apr 23 16:39:17.454916 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.454877 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f"} err="failed to get container status \"ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f\": rpc error: code = NotFound desc = could not find container \"ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f\": container with ID starting with ad8637153f980fc903f44a604159fd1b53ffcec31aaca0a78d865147d527f06f not found: ID does not exist" Apr 23 16:39:17.474320 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474299 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:17.474635 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474621 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-web" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474637 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-web" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474645 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="prom-label-proxy" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474650 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="prom-label-proxy" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474660 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" containerName="console" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474665 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" containerName="console" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474671 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="alertmanager" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474676 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="alertmanager" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474681 
2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="config-reloader" Apr 23 16:39:17.474688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474687 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="config-reloader" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474698 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerName="registry" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474706 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerName="registry" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474716 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474720 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474727 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="init-config-reloader" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474733 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="init-config-reloader" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474741 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-metric" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474746 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-metric" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474796 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-web" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474804 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="config-reloader" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474809 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="prom-label-proxy" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474815 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="alertmanager" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474821 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474827 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="19188d6b-5c97-42ae-92f3-cebd3e2636fb" containerName="console" Apr 23 16:39:17.474930 ip-10-0-129-102 
kubenswrapper[2578]: I0423 16:39:17.474833 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" containerName="kube-rbac-proxy-metric" Apr 23 16:39:17.474930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.474841 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f1ab387-785c-479f-b2b2-27f092332c1b" containerName="registry" Apr 23 16:39:17.480266 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.480243 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.482837 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.482816 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-h22r9\"" Apr 23 16:39:17.483150 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483128 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 16:39:17.483247 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483154 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 16:39:17.483247 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483165 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 16:39:17.483247 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483155 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 16:39:17.483659 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483641 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 16:39:17.483765 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483750 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 16:39:17.483834 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 16:39:17.483834 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.483814 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 16:39:17.489912 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.489894 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 16:39:17.498172 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.498147 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:17.514698 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-web-config\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514809 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514710 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514809 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514809 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-config-volume\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514973 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514973 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514973 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514973 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514924 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.514973 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514965 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-config-out\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.515199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.514997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" 
(UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.515199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.515059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.515199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.515083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.515199 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.515116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztdnd\" (UniqueName: \"kubernetes.io/projected/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-kube-api-access-ztdnd\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615598 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztdnd\" (UniqueName: \"kubernetes.io/projected/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-kube-api-access-ztdnd\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-web-config\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-alertmanager-main-db\") 
pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615728 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615986 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-config-volume\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615986 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615986 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.615986 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615966 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.616143 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.615994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.616143 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.616037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-config-out\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.616143 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.616063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.616581 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.616557 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.616845 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.616819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.617446 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.617423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.618908 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.618880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-config-volume\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619022 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.618998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619111 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.619093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-web-config\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619339 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.619313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619574 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.619546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619681 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.619637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619681 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.619637 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.619795 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.619767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-config-out\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.620833 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.620814 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.623172 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.623155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztdnd\" (UniqueName: \"kubernetes.io/projected/c2a18e2c-6e57-4281-89be-0b5ff6a32cfb-kube-api-access-ztdnd\") pod \"alertmanager-main-0\" (UID: \"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.790729 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.790633 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:39:17.917768 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:17.917734 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:39:17.920652 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:39:17.920619 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a18e2c_6e57_4281_89be_0b5ff6a32cfb.slice/crio-56ef2932ad0495098fefad0cd65625c953fe184d390eb9cb32b50bcc915198db WatchSource:0}: Error finding container 56ef2932ad0495098fefad0cd65625c953fe184d390eb9cb32b50bcc915198db: Status 404 returned error can't find the container with id 56ef2932ad0495098fefad0cd65625c953fe184d390eb9cb32b50bcc915198db Apr 23 16:39:18.407865 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:18.407825 2578 generic.go:358] "Generic (PLEG): container finished" podID="c2a18e2c-6e57-4281-89be-0b5ff6a32cfb" containerID="3ee40241dfbce416d4960e63d1673dec33e8098a5287ec0bc13b4e5d1d7d1257" exitCode=0 Apr 23 16:39:18.408011 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:18.407875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerDied","Data":"3ee40241dfbce416d4960e63d1673dec33e8098a5287ec0bc13b4e5d1d7d1257"} Apr 23 16:39:18.408011 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:18.407901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"56ef2932ad0495098fefad0cd65625c953fe184d390eb9cb32b50bcc915198db"} Apr 23 16:39:18.575727 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:18.575699 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31edc170-24b8-484a-9f30-9c9cc72cd719" path="/var/lib/kubelet/pods/31edc170-24b8-484a-9f30-9c9cc72cd719/volumes" Apr 23 16:39:19.414555 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.414517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"eba7047d491c3fcdec5e9d4bf97116c851ff22627508d9f5e8517d68383a12b9"} Apr 23 16:39:19.414555 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.414553 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"c84fe8ef8ddad1ea03f6ca48254b356452668806466e7d47456ccb699b5427aa"} Apr 23 16:39:19.414555 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.414562 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"dba16609eb85487bb330605208d44f5a05868fa52d8eb3b55f8b662c8397ddbd"} Apr 23 16:39:19.414952 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.414571 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"9a83716a05510b5f1604f4477752675fca0b48603107114b6efb377d5b169cc0"} Apr 23 16:39:19.414952 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.414579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"6b61e2175468c64af740e43f370f7cd5067eb1ce445ba9ba80526d2edd51a1b4"} Apr 23 16:39:19.414952 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.414587 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c2a18e2c-6e57-4281-89be-0b5ff6a32cfb","Type":"ContainerStarted","Data":"c76e4c5f6888b8fe6331e27000e661f7499d295ae815c5987c9dbdfcdbbc7c0b"} Apr 23 16:39:19.443967 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:19.443914 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.443900631 podStartE2EDuration="2.443900631s" podCreationTimestamp="2026-04-23 16:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:39:19.442974313 +0000 UTC m=+251.465370455" watchObservedRunningTime="2026-04-23 16:39:19.443900631 +0000 UTC m=+251.466296754" Apr 23 16:39:20.339289 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:20.339252 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:39:20.341579 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:20.341550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7f21f2f-2763-41c8-af5e-52de8001226b-metrics-certs\") pod \"network-metrics-daemon-h6kzn\" (UID: \"b7f21f2f-2763-41c8-af5e-52de8001226b\") " pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:39:20.574907 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:20.574866 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f89h4\"" Apr 23 16:39:20.583633 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:20.583609 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6kzn" Apr 23 16:39:20.699078 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:20.699045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h6kzn"] Apr 23 16:39:20.702107 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:39:20.702080 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7f21f2f_2763_41c8_af5e_52de8001226b.slice/crio-b520a774cc0f75332fb550e2c2047265619d028633db41f9dc65a7643281c95d WatchSource:0}: Error finding container b520a774cc0f75332fb550e2c2047265619d028633db41f9dc65a7643281c95d: Status 404 returned error can't find the container with id b520a774cc0f75332fb550e2c2047265619d028633db41f9dc65a7643281c95d Apr 23 16:39:21.422043 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:21.422003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6kzn" event={"ID":"b7f21f2f-2763-41c8-af5e-52de8001226b","Type":"ContainerStarted","Data":"b520a774cc0f75332fb550e2c2047265619d028633db41f9dc65a7643281c95d"} Apr 23 16:39:22.426964 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:22.426930 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6kzn" event={"ID":"b7f21f2f-2763-41c8-af5e-52de8001226b","Type":"ContainerStarted","Data":"f0fbe67a39bddc3868bb16cad3b3b1927758a34b58f43eeb4c382f90967233a5"} Apr 23 16:39:22.426964 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:22.426969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6kzn" event={"ID":"b7f21f2f-2763-41c8-af5e-52de8001226b","Type":"ContainerStarted","Data":"2ce9d348b257b979245062aa8535ed406d1eb994588d753d9f81e674f81d6f54"} Apr 23 16:39:22.442456 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:22.442397 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h6kzn" podStartSLOduration=253.361722444 podStartE2EDuration="4m14.442370264s" podCreationTimestamp="2026-04-23 16:35:08 +0000 UTC" firstStartedPulling="2026-04-23 16:39:20.704484122 +0000 UTC m=+252.726880231" lastFinishedPulling="2026-04-23 16:39:21.785131957 +0000 UTC m=+253.807528051" observedRunningTime="2026-04-23 16:39:22.440115498 +0000 UTC m=+254.462511614" watchObservedRunningTime="2026-04-23 16:39:22.442370264 +0000 UTC m=+254.464766380" Apr 23 16:39:40.643954 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.643914 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bc586b777-wlsqt"] Apr 23 16:39:40.647454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.647431 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.659613 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.659590 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bc586b777-wlsqt"] Apr 23 16:39:40.697877 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.697836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-service-ca\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.698049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.697889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-oauth-serving-cert\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.698049 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.697975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9wvx\" (UniqueName: \"kubernetes.io/projected/e7f39b63-535e-42da-809f-6ae8bafc6786-kube-api-access-g9wvx\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.698166 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.698066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-console-config\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.698166 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.698098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-serving-cert\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.698166 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.698124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-oauth-config\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.698289 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.698212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-trusted-ca-bundle\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799575 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-service-ca\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799575 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-oauth-serving-cert\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799767 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wvx\" (UniqueName: \"kubernetes.io/projected/e7f39b63-535e-42da-809f-6ae8bafc6786-kube-api-access-g9wvx\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799767 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-console-config\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799767 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-serving-cert\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799767 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-oauth-config\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.799767 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.799740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-trusted-ca-bundle\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.800347 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.800308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-service-ca\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.800485 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.800359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-oauth-serving-cert\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.800485 ip-10-0-129-102 
kubenswrapper[2578]: I0423 16:39:40.800443 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-console-config\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.800485 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.800466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-trusted-ca-bundle\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.802160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.802141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-oauth-config\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.802238 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.802169 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-serving-cert\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.808205 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.808187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9wvx\" (UniqueName: \"kubernetes.io/projected/e7f39b63-535e-42da-809f-6ae8bafc6786-kube-api-access-g9wvx\") pod \"console-5bc586b777-wlsqt\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") " pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:40.956906 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:40.956835 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:41.075783 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:41.075761 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bc586b777-wlsqt"] Apr 23 16:39:41.077865 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:39:41.077838 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7f39b63_535e_42da_809f_6ae8bafc6786.slice/crio-4451dc59a2bffdd01585b46d405c4f0d6c63ebb1cc053a6a309803a57d1ad4f7 WatchSource:0}: Error finding container 4451dc59a2bffdd01585b46d405c4f0d6c63ebb1cc053a6a309803a57d1ad4f7: Status 404 returned error can't find the container with id 4451dc59a2bffdd01585b46d405c4f0d6c63ebb1cc053a6a309803a57d1ad4f7 Apr 23 16:39:41.485086 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:41.485053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc586b777-wlsqt" event={"ID":"e7f39b63-535e-42da-809f-6ae8bafc6786","Type":"ContainerStarted","Data":"fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb"} Apr 23 16:39:41.485086 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:41.485088 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc586b777-wlsqt" event={"ID":"e7f39b63-535e-42da-809f-6ae8bafc6786","Type":"ContainerStarted","Data":"4451dc59a2bffdd01585b46d405c4f0d6c63ebb1cc053a6a309803a57d1ad4f7"} Apr 23 16:39:41.503631 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:41.503588 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bc586b777-wlsqt" podStartSLOduration=1.5035762400000001 podStartE2EDuration="1.50357624s" podCreationTimestamp="2026-04-23 16:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:39:41.502688869 +0000 UTC m=+273.525084984" watchObservedRunningTime="2026-04-23 16:39:41.50357624 +0000 UTC m=+273.525972355" Apr 23 16:39:50.957229 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:50.957178 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:50.957229 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:50.957245 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:50.961930 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:50.961906 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:51.517601 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:51.517573 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:39:51.567797 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:39:51.567764 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c76c6945d-7thrk"] Apr 23 16:40:08.460579 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:08.460549 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:40:08.461007 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:08.460886 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:40:08.465201 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:08.465180 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:40:16.586909 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.586872 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c76c6945d-7thrk" podUID="8e52370b-5f13-4378-b500-d5e555a57e65" containerName="console" containerID="cri-o://c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5" gracePeriod=15 Apr 23 16:40:16.823057 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.823031 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c76c6945d-7thrk_8e52370b-5f13-4378-b500-d5e555a57e65/console/0.log" Apr 23 16:40:16.823191 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.823106 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c76c6945d-7thrk" Apr 23 16:40:16.895520 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-trusted-ca-bundle\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.895520 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895479 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-oauth-config\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.895520 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895507 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-oauth-serving-cert\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.895780 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895533 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fss2s\" (UniqueName: \"kubernetes.io/projected/8e52370b-5f13-4378-b500-d5e555a57e65-kube-api-access-fss2s\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.895780 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895551 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-console-config\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.895780 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895570 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-service-ca\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.895780 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895593 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-serving-cert\") pod \"8e52370b-5f13-4378-b500-d5e555a57e65\" (UID: \"8e52370b-5f13-4378-b500-d5e555a57e65\") " Apr 23 16:40:16.896001 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.895972 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:16.896061 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.896018 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-service-ca" (OuterVolumeSpecName: "service-ca") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:16.896114 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.896061 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-console-config" (OuterVolumeSpecName: "console-config") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:16.896114 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.896092 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:40:16.897914 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.897887 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:40:16.898006 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.897920 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e52370b-5f13-4378-b500-d5e555a57e65-kube-api-access-fss2s" (OuterVolumeSpecName: "kube-api-access-fss2s") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "kube-api-access-fss2s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:40:16.898006 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.897938 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8e52370b-5f13-4378-b500-d5e555a57e65" (UID: "8e52370b-5f13-4378-b500-d5e555a57e65"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:40:16.996653 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996617 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:16.996653 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996646 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-oauth-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:16.996653 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996656 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-oauth-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:16.996868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996665 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fss2s\" (UniqueName: \"kubernetes.io/projected/8e52370b-5f13-4378-b500-d5e555a57e65-kube-api-access-fss2s\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:16.996868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996674 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-console-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:16.996868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996683 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e52370b-5f13-4378-b500-d5e555a57e65-service-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:16.996868 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:16.996691 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e52370b-5f13-4378-b500-d5e555a57e65-console-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:17.586103 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.586077 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c76c6945d-7thrk_8e52370b-5f13-4378-b500-d5e555a57e65/console/0.log" Apr 23 16:40:17.586239 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.586116 2578 generic.go:358] "Generic (PLEG): container finished" podID="8e52370b-5f13-4378-b500-d5e555a57e65" containerID="c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5" exitCode=2 Apr 23 16:40:17.586239 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.586147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c76c6945d-7thrk" event={"ID":"8e52370b-5f13-4378-b500-d5e555a57e65","Type":"ContainerDied","Data":"c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5"} Apr 23 16:40:17.586239 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.586173 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c76c6945d-7thrk" Apr 23 16:40:17.586239 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.586187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c76c6945d-7thrk" event={"ID":"8e52370b-5f13-4378-b500-d5e555a57e65","Type":"ContainerDied","Data":"a5cdeb4f878855a8ad9adb4fd8f0748e28a396cb1cb0643e49bcbb2b921e8950"} Apr 23 16:40:17.586239 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.586205 2578 scope.go:117] "RemoveContainer" containerID="c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5" Apr 23 16:40:17.594271 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.594132 2578 scope.go:117] "RemoveContainer" containerID="c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5" Apr 23 16:40:17.594521 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:40:17.594405 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5\": container with ID starting with c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5 not found: ID does not exist" containerID="c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5" Apr 23 16:40:17.594521 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.594433 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5"} err="failed to get container status \"c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5\": rpc error: code = NotFound desc = could not find container \"c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5\": container with ID starting with c78b018771796beaa0f3e0b031776ce705bb1e78e50a297d7c91bf20bf47e4f5 not found: ID does not exist" Apr 23 16:40:17.606176 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.606152 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c76c6945d-7thrk"] Apr 23 16:40:17.610194 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:17.610172 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c76c6945d-7thrk"] Apr 23 16:40:18.578569 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:18.576930 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52370b-5f13-4378-b500-d5e555a57e65" path="/var/lib/kubelet/pods/8e52370b-5f13-4378-b500-d5e555a57e65/volumes" Apr 23 16:40:33.753765 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.753732 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj"] Apr 23 16:40:33.754164 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.754062 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e52370b-5f13-4378-b500-d5e555a57e65" containerName="console" Apr 23 16:40:33.754164 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.754072 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52370b-5f13-4378-b500-d5e555a57e65" containerName="console" Apr 23 16:40:33.754164 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.754121 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e52370b-5f13-4378-b500-d5e555a57e65" containerName="console" Apr 23 16:40:33.756131 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.756114 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.758969 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.758943 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tt7pr\"" Apr 23 16:40:33.759585 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.759556 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 16:40:33.759585 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.759562 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 16:40:33.767167 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.767144 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj"] Apr 23 16:40:33.832938 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.832901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.833101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.832963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.833101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.833035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25dmt\" (UniqueName: \"kubernetes.io/projected/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-kube-api-access-25dmt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.934427 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.934392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25dmt\" (UniqueName: \"kubernetes.io/projected/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-kube-api-access-25dmt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.934592 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.934464 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.934592 ip-10-0-129-102 kubenswrapper[2578]: I0423 
16:40:33.934498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.934855 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.934835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.934891 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.934861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:33.943045 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:33.943021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25dmt\" (UniqueName: \"kubernetes.io/projected/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-kube-api-access-25dmt\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:34.065871 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:34.065843 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:34.183309 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:34.183160 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj"] Apr 23 16:40:34.186297 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:40:34.186275 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c0992f9_da57_4a88_8d1e_8b5c9635bac0.slice/crio-3e13afe6ff0de9e6a69eab5c19450f6129a7cd5184bfbcd705097df48dfe6905 WatchSource:0}: Error finding container 3e13afe6ff0de9e6a69eab5c19450f6129a7cd5184bfbcd705097df48dfe6905: Status 404 returned error can't find the container with id 3e13afe6ff0de9e6a69eab5c19450f6129a7cd5184bfbcd705097df48dfe6905 Apr 23 16:40:34.187956 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:34.187940 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:40:34.637938 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:34.637901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" event={"ID":"0c0992f9-da57-4a88-8d1e-8b5c9635bac0","Type":"ContainerStarted","Data":"3e13afe6ff0de9e6a69eab5c19450f6129a7cd5184bfbcd705097df48dfe6905"} Apr 23 16:40:40.658797 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:40.658758 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerID="a5e5a65446ea83a419139fd8e594e5720bcf0e398cde0bfa09bc6ab8d069d80b" exitCode=0 Apr 23 16:40:40.659140 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:40.658834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" event={"ID":"0c0992f9-da57-4a88-8d1e-8b5c9635bac0","Type":"ContainerDied","Data":"a5e5a65446ea83a419139fd8e594e5720bcf0e398cde0bfa09bc6ab8d069d80b"} Apr 23 16:40:43.669726 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:43.669686 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerID="b82efd0a37108e030f4b61c0fd7d6275a175a0a3f1f39bf79146d74e3e917892" exitCode=0 Apr 23 16:40:43.670202 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:43.669770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" event={"ID":"0c0992f9-da57-4a88-8d1e-8b5c9635bac0","Type":"ContainerDied","Data":"b82efd0a37108e030f4b61c0fd7d6275a175a0a3f1f39bf79146d74e3e917892"} Apr 23 16:40:50.694523 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:50.694487 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerID="d1760f1b5d8c3b88eaa06f8d3f698219ef11fb1ce6e17d0680ba2f80cf579612" exitCode=0 Apr 23 16:40:50.694914 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:50.694535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" event={"ID":"0c0992f9-da57-4a88-8d1e-8b5c9635bac0","Type":"ContainerDied","Data":"d1760f1b5d8c3b88eaa06f8d3f698219ef11fb1ce6e17d0680ba2f80cf579612"} Apr 23 16:40:51.816099 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.816074 2578 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:51.889435 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.889368 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-bundle\") pod \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " Apr 23 16:40:51.889584 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.889446 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25dmt\" (UniqueName: \"kubernetes.io/projected/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-kube-api-access-25dmt\") pod \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " Apr 23 16:40:51.889584 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.889514 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-util\") pod \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\" (UID: \"0c0992f9-da57-4a88-8d1e-8b5c9635bac0\") " Apr 23 16:40:51.889957 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.889925 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-bundle" (OuterVolumeSpecName: "bundle") pod "0c0992f9-da57-4a88-8d1e-8b5c9635bac0" (UID: "0c0992f9-da57-4a88-8d1e-8b5c9635bac0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:40:51.891596 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.891561 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-kube-api-access-25dmt" (OuterVolumeSpecName: "kube-api-access-25dmt") pod "0c0992f9-da57-4a88-8d1e-8b5c9635bac0" (UID: "0c0992f9-da57-4a88-8d1e-8b5c9635bac0"). InnerVolumeSpecName "kube-api-access-25dmt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:40:51.893357 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.893335 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-util" (OuterVolumeSpecName: "util") pod "0c0992f9-da57-4a88-8d1e-8b5c9635bac0" (UID: "0c0992f9-da57-4a88-8d1e-8b5c9635bac0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:40:51.991151 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.991072 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:51.991151 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.991101 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-25dmt\" (UniqueName: \"kubernetes.io/projected/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-kube-api-access-25dmt\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:51.991151 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:51.991111 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0c0992f9-da57-4a88-8d1e-8b5c9635bac0-util\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:40:52.702459 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:52.702425 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" event={"ID":"0c0992f9-da57-4a88-8d1e-8b5c9635bac0","Type":"ContainerDied","Data":"3e13afe6ff0de9e6a69eab5c19450f6129a7cd5184bfbcd705097df48dfe6905"} Apr 23 16:40:52.702459 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:52.702459 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e13afe6ff0de9e6a69eab5c19450f6129a7cd5184bfbcd705097df48dfe6905" Apr 23 16:40:52.702648 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:52.702485 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c68bvj" Apr 23 16:40:56.337719 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.337630 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc"] Apr 23 16:40:56.338088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.337964 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="pull" Apr 23 16:40:56.338088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.337978 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="pull" Apr 23 16:40:56.338088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.337998 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="extract" Apr 23 16:40:56.338088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.338003 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="extract" Apr 23 16:40:56.338088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.338009 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="util" Apr 23 16:40:56.338088 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.338013 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="util" Apr 23 16:40:56.343234 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.338473 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c0992f9-da57-4a88-8d1e-8b5c9635bac0" containerName="extract" Apr 
23 16:40:56.346601 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.346581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.349020 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.348996 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 16:40:56.349253 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.349235 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 16:40:56.349308 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.349254 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 16:40:56.349308 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.349277 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-qd64s\"" Apr 23 16:40:56.355277 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.355253 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc"] Apr 23 16:40:56.528885 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.528848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fa30c036-dd8a-4855-b626-20fa3aab5ea8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x9knc\" (UID: \"fa30c036-dd8a-4855-b626-20fa3aab5ea8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.529050 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.528893 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzhx\" (UniqueName: \"kubernetes.io/projected/fa30c036-dd8a-4855-b626-20fa3aab5ea8-kube-api-access-6mzhx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x9knc\" (UID: \"fa30c036-dd8a-4855-b626-20fa3aab5ea8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.629732 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.629644 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fa30c036-dd8a-4855-b626-20fa3aab5ea8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x9knc\" (UID: \"fa30c036-dd8a-4855-b626-20fa3aab5ea8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.629732 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.629691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzhx\" (UniqueName: \"kubernetes.io/projected/fa30c036-dd8a-4855-b626-20fa3aab5ea8-kube-api-access-6mzhx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x9knc\" (UID: \"fa30c036-dd8a-4855-b626-20fa3aab5ea8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.632020 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.632000 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/fa30c036-dd8a-4855-b626-20fa3aab5ea8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x9knc\" (UID: \"fa30c036-dd8a-4855-b626-20fa3aab5ea8\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.641777 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.641748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzhx\" (UniqueName: \"kubernetes.io/projected/fa30c036-dd8a-4855-b626-20fa3aab5ea8-kube-api-access-6mzhx\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-x9knc\" (UID: \"fa30c036-dd8a-4855-b626-20fa3aab5ea8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.657877 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.657846 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:40:56.789315 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:56.789270 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc"] Apr 23 16:40:56.791551 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:40:56.791524 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa30c036_dd8a_4855_b626_20fa3aab5ea8.slice/crio-b6fbf2474f36164a36e2adba7a4c1cc2864237c67d484392b551e33d1fa6f12f WatchSource:0}: Error finding container b6fbf2474f36164a36e2adba7a4c1cc2864237c67d484392b551e33d1fa6f12f: Status 404 returned error can't find the container with id b6fbf2474f36164a36e2adba7a4c1cc2864237c67d484392b551e33d1fa6f12f Apr 23 16:40:57.720313 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:40:57.720269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" event={"ID":"fa30c036-dd8a-4855-b626-20fa3aab5ea8","Type":"ContainerStarted","Data":"b6fbf2474f36164a36e2adba7a4c1cc2864237c67d484392b551e33d1fa6f12f"} Apr 23 16:41:02.737673 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.737634 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" event={"ID":"fa30c036-dd8a-4855-b626-20fa3aab5ea8","Type":"ContainerStarted","Data":"adc5bff8e63db30167bcc30753c12fe5cc7e8cc897236dfb524b39a216440845"} Apr 23 16:41:02.738101 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.737703 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" Apr 23 16:41:02.750416 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.750369 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-c742c"] Apr 23 16:41:02.753762 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.753741 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.755740 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.755722 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 23 16:41:02.755852 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.755722 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 23 16:41:02.755920 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.755902 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5rpsh\"" Apr 23 16:41:02.758200 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.758163 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc" podStartSLOduration=1.299968241 podStartE2EDuration="6.75814862s" podCreationTimestamp="2026-04-23 16:40:56 +0000 UTC" firstStartedPulling="2026-04-23 16:40:56.793211799 +0000 UTC m=+348.815607897" lastFinishedPulling="2026-04-23 16:41:02.251392169 +0000 UTC m=+354.273788276" observedRunningTime="2026-04-23 16:41:02.756654896 +0000 UTC m=+354.779051009" watchObservedRunningTime="2026-04-23 16:41:02.75814862 +0000 UTC m=+354.780544737" Apr 23 16:41:02.761560 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.761541 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-c742c"] Apr 23 16:41:02.884637 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.884597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.884823 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.884644 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/46edb611-6399-46a8-b392-e4531ab8924e-cabundle0\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.884823 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.884674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znbkh\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-kube-api-access-znbkh\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.985485 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.985442 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/46edb611-6399-46a8-b392-e4531ab8924e-cabundle0\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.985651 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.985496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znbkh\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-kube-api-access-znbkh\") pod 
\"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.985651 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.985629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.985768 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:02.985750 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 23 16:41:02.985825 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:02.985775 2578 secret.go:281] references non-existent secret key: ca.crt Apr 23 16:41:02.985825 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:02.985782 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 23 16:41:02.985825 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:02.985793 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c742c: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 23 16:41:02.985982 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:02.985839 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates podName:46edb611-6399-46a8-b392-e4531ab8924e nodeName:}" failed. No retries permitted until 2026-04-23 16:41:03.485824562 +0000 UTC m=+355.508220656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates") pod "keda-operator-ffbb595cb-c742c" (UID: "46edb611-6399-46a8-b392-e4531ab8924e") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 23 16:41:02.986051 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.986033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/46edb611-6399-46a8-b392-e4531ab8924e-cabundle0\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:02.996777 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:02.996721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znbkh\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-kube-api-access-znbkh\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c" Apr 23 16:41:03.027293 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.027267 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"] Apr 23 16:41:03.030575 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.030560 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" Apr 23 16:41:03.032729 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.032712 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 23 16:41:03.039423 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.039399 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"] Apr 23 16:41:03.187390 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.187342 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/67a8b929-5160-4c98-9982-963d4a652e67-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" Apr 23 16:41:03.187549 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.187462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njh7d\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-kube-api-access-njh7d\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" Apr 23 16:41:03.187615 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.187549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" Apr 23 16:41:03.228020 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.227984 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-f8tjb"] Apr 23 16:41:03.231200 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.231177 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.234171 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.234153 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 23 16:41:03.242302 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.242281 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-f8tjb"]
Apr 23 16:41:03.288664 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.288588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/67a8b929-5160-4c98-9982-963d4a652e67-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:03.288664 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.288653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njh7d\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-kube-api-access-njh7d\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:03.288856 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.288696 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:03.288856 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.288831 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:41:03.288856 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.288846 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:41:03.289007 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.288863 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj: references non-existent secret key: tls.crt
Apr 23 16:41:03.289007 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.288906 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates podName:67a8b929-5160-4c98-9982-963d4a652e67 nodeName:}" failed. No retries permitted until 2026-04-23 16:41:03.788893568 +0000 UTC m=+355.811289663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates") pod "keda-metrics-apiserver-7c9f485588-jtnmj" (UID: "67a8b929-5160-4c98-9982-963d4a652e67") : references non-existent secret key: tls.crt
Apr 23 16:41:03.289007 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.288919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/67a8b929-5160-4c98-9982-963d4a652e67-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:03.296736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.296710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njh7d\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-kube-api-access-njh7d\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:03.389992 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.389955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzxx6\" (UniqueName: \"kubernetes.io/projected/8bdbb2e9-3143-4234-991f-f16bbcf9f0a1-kube-api-access-dzxx6\") pod \"keda-admission-cf49989db-f8tjb\" (UID: \"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1\") " pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.389992 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.389996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8bdbb2e9-3143-4234-991f-f16bbcf9f0a1-certificates\") pod \"keda-admission-cf49989db-f8tjb\" (UID: \"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1\") " pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.490569 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.490530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzxx6\" (UniqueName: \"kubernetes.io/projected/8bdbb2e9-3143-4234-991f-f16bbcf9f0a1-kube-api-access-dzxx6\") pod \"keda-admission-cf49989db-f8tjb\" (UID: \"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1\") " pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.490747 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.490581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8bdbb2e9-3143-4234-991f-f16bbcf9f0a1-certificates\") pod \"keda-admission-cf49989db-f8tjb\" (UID: \"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1\") " pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.490747 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.490671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:41:03.490832 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.490810 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:41:03.490866 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.490831 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:41:03.490866 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.490842 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c742c: references non-existent secret key: ca.crt
Apr 23 16:41:03.490930 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.490902 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates podName:46edb611-6399-46a8-b392-e4531ab8924e nodeName:}" failed. No retries permitted until 2026-04-23 16:41:04.490883491 +0000 UTC m=+356.513279588 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates") pod "keda-operator-ffbb595cb-c742c" (UID: "46edb611-6399-46a8-b392-e4531ab8924e") : references non-existent secret key: ca.crt
Apr 23 16:41:03.493272 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.493237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8bdbb2e9-3143-4234-991f-f16bbcf9f0a1-certificates\") pod \"keda-admission-cf49989db-f8tjb\" (UID: \"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1\") " pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.498359 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.498332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzxx6\" (UniqueName: \"kubernetes.io/projected/8bdbb2e9-3143-4234-991f-f16bbcf9f0a1-kube-api-access-dzxx6\") pod \"keda-admission-cf49989db-f8tjb\" (UID: \"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1\") " pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.541195 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.541136 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:03.663831 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.663795 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-f8tjb"]
Apr 23 16:41:03.667477 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:41:03.667448 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bdbb2e9_3143_4234_991f_f16bbcf9f0a1.slice/crio-0982561f6a03e640d08ef54f885584e7b600960eb8f9cfeac6eaa1437bb25efd WatchSource:0}: Error finding container 0982561f6a03e640d08ef54f885584e7b600960eb8f9cfeac6eaa1437bb25efd: Status 404 returned error can't find the container with id 0982561f6a03e640d08ef54f885584e7b600960eb8f9cfeac6eaa1437bb25efd
Apr 23 16:41:03.741746 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.741708 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-f8tjb" event={"ID":"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1","Type":"ContainerStarted","Data":"0982561f6a03e640d08ef54f885584e7b600960eb8f9cfeac6eaa1437bb25efd"}
Apr 23 16:41:03.792920 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:03.792850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:03.793053 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.792984 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:41:03.793053 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.792999 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:41:03.793053 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.793017 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj: references non-existent secret key: tls.crt
Apr 23 16:41:03.793159 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:03.793066 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates podName:67a8b929-5160-4c98-9982-963d4a652e67 nodeName:}" failed. No retries permitted until 2026-04-23 16:41:04.793049488 +0000 UTC m=+356.815445600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates") pod "keda-metrics-apiserver-7c9f485588-jtnmj" (UID: "67a8b929-5160-4c98-9982-963d4a652e67") : references non-existent secret key: tls.crt
Apr 23 16:41:04.499711 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:04.499674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:41:04.499886 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.499821 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 23 16:41:04.499886 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.499842 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 16:41:04.499886 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.499852 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-c742c: references non-existent secret key: ca.crt
Apr 23 16:41:04.499983 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.499906 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates podName:46edb611-6399-46a8-b392-e4531ab8924e nodeName:}" failed. No retries permitted until 2026-04-23 16:41:06.499892013 +0000 UTC m=+358.522288107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates") pod "keda-operator-ffbb595cb-c742c" (UID: "46edb611-6399-46a8-b392-e4531ab8924e") : references non-existent secret key: ca.crt
Apr 23 16:41:04.803220 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:04.803141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:04.803583 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.803258 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 23 16:41:04.803583 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.803272 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 16:41:04.803583 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.803290 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj: references non-existent secret key: tls.crt
Apr 23 16:41:04.803583 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:41:04.803339 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates podName:67a8b929-5160-4c98-9982-963d4a652e67 nodeName:}" failed. No retries permitted until 2026-04-23 16:41:06.803322979 +0000 UTC m=+358.825719072 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates") pod "keda-metrics-apiserver-7c9f485588-jtnmj" (UID: "67a8b929-5160-4c98-9982-963d4a652e67") : references non-existent secret key: tls.crt
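The durationBeforeRetry progression above (500ms, then 1s, then 2s) is the volume manager's exponential backoff: the projected "certificates" volume cannot be set up until the openshift-keda/kedaorg-certs secret actually contains the tls.crt and ca.crt keys, so each failed attempt doubles the wait before the next one. A minimal Go sketch of the same wait-for-secret-keys pattern; the client-go calls are standard, but the polling helper itself is an illustrative assumption, not kubelet code:

    // Illustrative only: poll a Secret with exponential backoff until the
    // keys a projected volume needs (tls.crt, ca.crt) exist.
    package main

    import (
        "context"
        "fmt"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        cfg, err := rest.InClusterConfig() // assumes in-cluster credentials
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        delay := 500 * time.Millisecond // matches durationBeforeRetry=500ms above
        for {
            s, err := client.CoreV1().Secrets("openshift-keda").
                Get(context.TODO(), "kedaorg-certs", metav1.GetOptions{})
            if err == nil {
                _, hasTLS := s.Data["tls.crt"]
                _, hasCA := s.Data["ca.crt"]
                if hasTLS && hasCA {
                    fmt.Println("secret fully populated; mount can proceed")
                    return
                }
            }
            fmt.Printf("keys missing, retrying in %v\n", delay)
            time.Sleep(delay)
            delay *= 2 // 500ms -> 1s -> 2s, the cadence seen in the log
        }
    }

As the entries that follow show, the secret was populated within a few seconds and the mounts then succeeded on their own; the failures here are a startup race, not a persistent fault.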
Apr 23 16:41:06.515879 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.515845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:41:06.518177 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.518155 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/46edb611-6399-46a8-b392-e4531ab8924e-certificates\") pod \"keda-operator-ffbb595cb-c742c\" (UID: \"46edb611-6399-46a8-b392-e4531ab8924e\") " pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:41:06.664853 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.664821 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:41:06.795184 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.795107 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-c742c"]
Apr 23 16:41:06.797793 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:41:06.797766 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46edb611_6399_46a8_b392_e4531ab8924e.slice/crio-854bbc6bb572b4248822d3ebc9c89214f3597e22dda1ec299cfdf3a1b2902ffe WatchSource:0}: Error finding container 854bbc6bb572b4248822d3ebc9c89214f3597e22dda1ec299cfdf3a1b2902ffe: Status 404 returned error can't find the container with id 854bbc6bb572b4248822d3ebc9c89214f3597e22dda1ec299cfdf3a1b2902ffe
Apr 23 16:41:06.818447 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.818418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:06.820781 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.820758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/67a8b929-5160-4c98-9982-963d4a652e67-certificates\") pod \"keda-metrics-apiserver-7c9f485588-jtnmj\" (UID: \"67a8b929-5160-4c98-9982-963d4a652e67\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:06.940814 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:06.940783 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:07.070173 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:07.070133 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"]
Apr 23 16:41:07.071476 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:41:07.071444 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a8b929_5160_4c98_9982_963d4a652e67.slice/crio-4ba1a5cdc74c50fd25e2a0f97a3dcce2a2cccbc00c58816553ce31415a57580b WatchSource:0}: Error finding container 4ba1a5cdc74c50fd25e2a0f97a3dcce2a2cccbc00c58816553ce31415a57580b: Status 404 returned error can't find the container with id 4ba1a5cdc74c50fd25e2a0f97a3dcce2a2cccbc00c58816553ce31415a57580b
Apr 23 16:41:07.758993 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:07.758938 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-c742c" event={"ID":"46edb611-6399-46a8-b392-e4531ab8924e","Type":"ContainerStarted","Data":"854bbc6bb572b4248822d3ebc9c89214f3597e22dda1ec299cfdf3a1b2902ffe"}
Apr 23 16:41:07.760103 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:07.760073 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" event={"ID":"67a8b929-5160-4c98-9982-963d4a652e67","Type":"ContainerStarted","Data":"4ba1a5cdc74c50fd25e2a0f97a3dcce2a2cccbc00c58816553ce31415a57580b"}
Apr 23 16:41:08.764551 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:08.764463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-f8tjb" event={"ID":"8bdbb2e9-3143-4234-991f-f16bbcf9f0a1","Type":"ContainerStarted","Data":"2d7b2246f8bb0f92c674cb977cfeb57fa70d837e84be99ae67f51be123996d2f"}
Apr 23 16:41:08.764971 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:08.764665 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:18.803889 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.803847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-c742c" event={"ID":"46edb611-6399-46a8-b392-e4531ab8924e","Type":"ContainerStarted","Data":"65ad99dc7967297c4225bb30a9497bbf339c3b51f94051c7197d6a75489d50b5"}
Apr 23 16:41:18.804350 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.803968 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:41:18.805267 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.805241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" event={"ID":"67a8b929-5160-4c98-9982-963d4a652e67","Type":"ContainerStarted","Data":"b67b36c3744848c4fd71a0c9a2fee43bbfe65dd10c35748cc7d78c04ec69024d"}
Apr 23 16:41:18.805422 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.805411 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:18.822620 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.822567 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-c742c" podStartSLOduration=4.984306822 podStartE2EDuration="16.822553583s" podCreationTimestamp="2026-04-23 16:41:02 +0000 UTC" firstStartedPulling="2026-04-23 16:41:06.798979935 +0000 UTC m=+358.821376029" lastFinishedPulling="2026-04-23 16:41:18.637226636 +0000 UTC m=+370.659622790" observedRunningTime="2026-04-23 16:41:18.82164582 +0000 UTC m=+370.844041949" watchObservedRunningTime="2026-04-23 16:41:18.822553583 +0000 UTC m=+370.844949708"
Apr 23 16:41:18.823372 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.823335 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-f8tjb" podStartSLOduration=11.085904671 podStartE2EDuration="15.823325747s" podCreationTimestamp="2026-04-23 16:41:03 +0000 UTC" firstStartedPulling="2026-04-23 16:41:03.669078535 +0000 UTC m=+355.691474629" lastFinishedPulling="2026-04-23 16:41:08.406499606 +0000 UTC m=+360.428895705" observedRunningTime="2026-04-23 16:41:08.783409486 +0000 UTC m=+360.805805602" watchObservedRunningTime="2026-04-23 16:41:18.823325747 +0000 UTC m=+370.845721864"
Apr 23 16:41:18.839682 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:18.839636 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj" podStartSLOduration=4.281543645 podStartE2EDuration="15.839627376s" podCreationTimestamp="2026-04-23 16:41:03 +0000 UTC" firstStartedPulling="2026-04-23 16:41:07.072981226 +0000 UTC m=+359.095377323" lastFinishedPulling="2026-04-23 16:41:18.631064957 +0000 UTC m=+370.653461054" observedRunningTime="2026-04-23 16:41:18.837138205 +0000 UTC m=+370.859534320" watchObservedRunningTime="2026-04-23 16:41:18.839627376 +0000 UTC m=+370.862023494"
Apr 23 16:41:23.744506 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:23.744472 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-x9knc"
Apr 23 16:41:29.770078 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:29.770044 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-f8tjb"
Apr 23 16:41:29.813203 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:29.813175 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-jtnmj"
Apr 23 16:41:39.810974 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:41:39.810944 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-c742c"
Apr 23 16:42:08.549553 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.549523 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"]
Apr 23 16:42:08.552791 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.552772 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.555163 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.555140 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 16:42:08.555261 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.555164 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-l9gnn\""
Apr 23 16:42:08.555563 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.555547 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 23 16:42:08.555620 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.555570 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 16:42:08.565832 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.565811 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"]
Apr 23 16:42:08.605873 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.605846 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-tllcv"]
Apr 23 16:42:08.608885 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.608868 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.611292 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.611274 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 23 16:42:08.611422 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.611292 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xrv5h\""
Apr 23 16:42:08.624356 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.624332 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-tllcv"]
Apr 23 16:42:08.629952 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.629934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59c1528b-83e6-4e0d-b9a9-37900b7098b2-cert\") pod \"llmisvc-controller-manager-6b94ff949c-rkqk9\" (UID: \"59c1528b-83e6-4e0d-b9a9-37900b7098b2\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.630048 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.629983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/421a788d-3f91-42f6-a112-cdf4bd228da3-data\") pod \"seaweedfs-86cc847c5c-tllcv\" (UID: \"421a788d-3f91-42f6-a112-cdf4bd228da3\") " pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.630097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.630055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5f4\" (UniqueName: \"kubernetes.io/projected/59c1528b-83e6-4e0d-b9a9-37900b7098b2-kube-api-access-hw5f4\") pod \"llmisvc-controller-manager-6b94ff949c-rkqk9\" (UID: \"59c1528b-83e6-4e0d-b9a9-37900b7098b2\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.630097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.630083 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wzhr\" (UniqueName: \"kubernetes.io/projected/421a788d-3f91-42f6-a112-cdf4bd228da3-kube-api-access-8wzhr\") pod \"seaweedfs-86cc847c5c-tllcv\" (UID: \"421a788d-3f91-42f6-a112-cdf4bd228da3\") " pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.730741 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.730702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59c1528b-83e6-4e0d-b9a9-37900b7098b2-cert\") pod \"llmisvc-controller-manager-6b94ff949c-rkqk9\" (UID: \"59c1528b-83e6-4e0d-b9a9-37900b7098b2\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.730898 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.730771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/421a788d-3f91-42f6-a112-cdf4bd228da3-data\") pod \"seaweedfs-86cc847c5c-tllcv\" (UID: \"421a788d-3f91-42f6-a112-cdf4bd228da3\") " pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.730898 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.730798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw5f4\" (UniqueName: \"kubernetes.io/projected/59c1528b-83e6-4e0d-b9a9-37900b7098b2-kube-api-access-hw5f4\") pod \"llmisvc-controller-manager-6b94ff949c-rkqk9\" (UID: \"59c1528b-83e6-4e0d-b9a9-37900b7098b2\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.730898 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.730818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wzhr\" (UniqueName: \"kubernetes.io/projected/421a788d-3f91-42f6-a112-cdf4bd228da3-kube-api-access-8wzhr\") pod \"seaweedfs-86cc847c5c-tllcv\" (UID: \"421a788d-3f91-42f6-a112-cdf4bd228da3\") " pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.731263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.731238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/421a788d-3f91-42f6-a112-cdf4bd228da3-data\") pod \"seaweedfs-86cc847c5c-tllcv\" (UID: \"421a788d-3f91-42f6-a112-cdf4bd228da3\") " pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.733280 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.733254 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59c1528b-83e6-4e0d-b9a9-37900b7098b2-cert\") pod \"llmisvc-controller-manager-6b94ff949c-rkqk9\" (UID: \"59c1528b-83e6-4e0d-b9a9-37900b7098b2\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.740052 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.740025 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wzhr\" (UniqueName: \"kubernetes.io/projected/421a788d-3f91-42f6-a112-cdf4bd228da3-kube-api-access-8wzhr\") pod \"seaweedfs-86cc847c5c-tllcv\" (UID: \"421a788d-3f91-42f6-a112-cdf4bd228da3\") " pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.741450 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.741429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw5f4\" (UniqueName: \"kubernetes.io/projected/59c1528b-83e6-4e0d-b9a9-37900b7098b2-kube-api-access-hw5f4\") pod \"llmisvc-controller-manager-6b94ff949c-rkqk9\" (UID: \"59c1528b-83e6-4e0d-b9a9-37900b7098b2\") " pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.862298 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.862254 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:08.918081 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.918053 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:08.987558 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:08.987513 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"]
Apr 23 16:42:08.989236 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:42:08.989180 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod59c1528b_83e6_4e0d_b9a9_37900b7098b2.slice/crio-d003182d89526a6a0a37978a9cb1b9e30d077e8e2f231ba0197f97684d1887c4 WatchSource:0}: Error finding container d003182d89526a6a0a37978a9cb1b9e30d077e8e2f231ba0197f97684d1887c4: Status 404 returned error can't find the container with id d003182d89526a6a0a37978a9cb1b9e30d077e8e2f231ba0197f97684d1887c4
Apr 23 16:42:09.054336 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:09.054311 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-tllcv"]
Apr 23 16:42:09.056224 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:42:09.056195 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421a788d_3f91_42f6_a112_cdf4bd228da3.slice/crio-aa744c4b10eca118e9f34860bc8ac830363097eb0169e69c20c1ec3fe932ab1e WatchSource:0}: Error finding container aa744c4b10eca118e9f34860bc8ac830363097eb0169e69c20c1ec3fe932ab1e: Status 404 returned error can't find the container with id aa744c4b10eca118e9f34860bc8ac830363097eb0169e69c20c1ec3fe932ab1e
Apr 23 16:42:09.974435 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:09.974371 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-tllcv" event={"ID":"421a788d-3f91-42f6-a112-cdf4bd228da3","Type":"ContainerStarted","Data":"aa744c4b10eca118e9f34860bc8ac830363097eb0169e69c20c1ec3fe932ab1e"}
Apr 23 16:42:09.975838 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:09.975808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9" event={"ID":"59c1528b-83e6-4e0d-b9a9-37900b7098b2","Type":"ContainerStarted","Data":"d003182d89526a6a0a37978a9cb1b9e30d077e8e2f231ba0197f97684d1887c4"}
Apr 23 16:42:12.988656 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:12.988572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-tllcv" event={"ID":"421a788d-3f91-42f6-a112-cdf4bd228da3","Type":"ContainerStarted","Data":"3b383a1a3306c6c80ed3d51cf3f1784109321932d80f8317b5d7bb01846cb2b1"}
Apr 23 16:42:12.989045 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:12.988662 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:12.990043 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:12.990021 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9" event={"ID":"59c1528b-83e6-4e0d-b9a9-37900b7098b2","Type":"ContainerStarted","Data":"898c551489d4be44098b8ba9b1ae0cb771b9eb516931467e2cf9b16275a6ac1d"}
Apr 23 16:42:12.990143 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:12.990124 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:42:13.006262 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:13.006217 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-tllcv" podStartSLOduration=1.4780745149999999 podStartE2EDuration="5.006205094s" podCreationTimestamp="2026-04-23 16:42:08 +0000 UTC" firstStartedPulling="2026-04-23 16:42:09.057557395 +0000 UTC m=+421.079953489" lastFinishedPulling="2026-04-23 16:42:12.585687975 +0000 UTC m=+424.608084068" observedRunningTime="2026-04-23 16:42:13.0040222 +0000 UTC m=+425.026418326" watchObservedRunningTime="2026-04-23 16:42:13.006205094 +0000 UTC m=+425.028601210"
Apr 23 16:42:13.020232 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:13.020185 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9" podStartSLOduration=1.481283884 podStartE2EDuration="5.020175224s" podCreationTimestamp="2026-04-23 16:42:08 +0000 UTC" firstStartedPulling="2026-04-23 16:42:08.990985981 +0000 UTC m=+421.013382091" lastFinishedPulling="2026-04-23 16:42:12.529877322 +0000 UTC m=+424.552273431" observedRunningTime="2026-04-23 16:42:13.018438521 +0000 UTC m=+425.040834650" watchObservedRunningTime="2026-04-23 16:42:13.020175224 +0000 UTC m=+425.042571340"
Apr 23 16:42:18.995232 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:18.995202 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-tllcv"
Apr 23 16:42:43.995808 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:42:43.995776 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6b94ff949c-rkqk9"
Apr 23 16:43:25.025194 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.025162 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-5wlsz"]
Apr 23 16:43:25.028605 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.028589 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.031018 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.030986 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-54fkq\""
Apr 23 16:43:25.031018 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.030990 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 23 16:43:25.041959 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.041937 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5wlsz"]
Apr 23 16:43:25.156200 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.156167 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c0a5e5e-d682-44d7-b461-1b13c974a88a-cert\") pod \"odh-model-controller-696fc77849-5wlsz\" (UID: \"4c0a5e5e-d682-44d7-b461-1b13c974a88a\") " pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.156357 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.156285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcn8m\" (UniqueName: \"kubernetes.io/projected/4c0a5e5e-d682-44d7-b461-1b13c974a88a-kube-api-access-qcn8m\") pod \"odh-model-controller-696fc77849-5wlsz\" (UID: \"4c0a5e5e-d682-44d7-b461-1b13c974a88a\") " pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.257160 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.257119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcn8m\" (UniqueName: \"kubernetes.io/projected/4c0a5e5e-d682-44d7-b461-1b13c974a88a-kube-api-access-qcn8m\") pod \"odh-model-controller-696fc77849-5wlsz\" (UID: \"4c0a5e5e-d682-44d7-b461-1b13c974a88a\") " pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.257327 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.257185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c0a5e5e-d682-44d7-b461-1b13c974a88a-cert\") pod \"odh-model-controller-696fc77849-5wlsz\" (UID: \"4c0a5e5e-d682-44d7-b461-1b13c974a88a\") " pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.259523 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.259499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c0a5e5e-d682-44d7-b461-1b13c974a88a-cert\") pod \"odh-model-controller-696fc77849-5wlsz\" (UID: \"4c0a5e5e-d682-44d7-b461-1b13c974a88a\") " pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.266708 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.266683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcn8m\" (UniqueName: \"kubernetes.io/projected/4c0a5e5e-d682-44d7-b461-1b13c974a88a-kube-api-access-qcn8m\") pod \"odh-model-controller-696fc77849-5wlsz\" (UID: \"4c0a5e5e-d682-44d7-b461-1b13c974a88a\") " pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.340205 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.340179 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:25.503145 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:25.503108 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-5wlsz"]
Apr 23 16:43:25.506349 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:43:25.506319 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0a5e5e_d682_44d7_b461_1b13c974a88a.slice/crio-85675f256766bd0293e0b576e0f7ae6967185bc1d140416de92a4c3625844378 WatchSource:0}: Error finding container 85675f256766bd0293e0b576e0f7ae6967185bc1d140416de92a4c3625844378: Status 404 returned error can't find the container with id 85675f256766bd0293e0b576e0f7ae6967185bc1d140416de92a4c3625844378
Apr 23 16:43:26.201689 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.201655 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cbf969bd8-vvmt9"]
Apr 23 16:43:26.205475 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.205453 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.219822 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.219777 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cbf969bd8-vvmt9"]
Apr 23 16:43:26.239122 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.239086 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5wlsz" event={"ID":"4c0a5e5e-d682-44d7-b461-1b13c974a88a","Type":"ContainerStarted","Data":"85675f256766bd0293e0b576e0f7ae6967185bc1d140416de92a4c3625844378"}
Apr 23 16:43:26.265544 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-oauth-config\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.265703 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-serving-cert\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.265703 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-oauth-serving-cert\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.265703 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265651 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9m7\" (UniqueName: \"kubernetes.io/projected/bd43d173-e3e7-434e-bf93-a2bca83ea33b-kube-api-access-xs9m7\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.265870 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-service-ca\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.265870 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-trusted-ca-bundle\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.265870 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.265819 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-config\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.366697 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.366499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-service-ca\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.366697 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.366548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-trusted-ca-bundle\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.366697 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.366590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-config\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.366697 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.366633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-oauth-config\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.366697 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.366661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-serving-cert\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.366697 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.366697 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-oauth-serving-cert\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.367256 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.367229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9m7\" (UniqueName: \"kubernetes.io/projected/bd43d173-e3e7-434e-bf93-a2bca83ea33b-kube-api-access-xs9m7\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.367514 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.367476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-service-ca\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.367514 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.367476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-oauth-serving-cert\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.367709 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.367687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-config\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.367799 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.367776 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd43d173-e3e7-434e-bf93-a2bca83ea33b-trusted-ca-bundle\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.370558 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.370511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-serving-cert\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.371090 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.371045 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd43d173-e3e7-434e-bf93-a2bca83ea33b-console-oauth-config\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.376181 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.376117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs9m7\" (UniqueName: \"kubernetes.io/projected/bd43d173-e3e7-434e-bf93-a2bca83ea33b-kube-api-access-xs9m7\") pod \"console-5cbf969bd8-vvmt9\" (UID: \"bd43d173-e3e7-434e-bf93-a2bca83ea33b\") " pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.519411 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.519317 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:26.669305 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:26.669277 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cbf969bd8-vvmt9"]
Apr 23 16:43:26.672094 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:43:26.672062 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd43d173_e3e7_434e_bf93_a2bca83ea33b.slice/crio-c48604cb46a0885bf8cd858985a9a661aa2b0ec3d5c526266ad0de5e80d9906c WatchSource:0}: Error finding container c48604cb46a0885bf8cd858985a9a661aa2b0ec3d5c526266ad0de5e80d9906c: Status 404 returned error can't find the container with id c48604cb46a0885bf8cd858985a9a661aa2b0ec3d5c526266ad0de5e80d9906c
Apr 23 16:43:27.243963 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:27.243920 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbf969bd8-vvmt9" event={"ID":"bd43d173-e3e7-434e-bf93-a2bca83ea33b","Type":"ContainerStarted","Data":"47124b2bc3aa6d9ed3a7c66d68d57a6ef89f0922f4950242c066fa1cf026d3a3"}
Apr 23 16:43:27.243963 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:27.243970 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cbf969bd8-vvmt9" event={"ID":"bd43d173-e3e7-434e-bf93-a2bca83ea33b","Type":"ContainerStarted","Data":"c48604cb46a0885bf8cd858985a9a661aa2b0ec3d5c526266ad0de5e80d9906c"}
Apr 23 16:43:27.264651 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:27.264601 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cbf969bd8-vvmt9" podStartSLOduration=1.264584289 podStartE2EDuration="1.264584289s" podCreationTimestamp="2026-04-23 16:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:43:27.264530567 +0000 UTC m=+499.286926706" watchObservedRunningTime="2026-04-23 16:43:27.264584289 +0000 UTC m=+499.286980405"
Apr 23 16:43:28.249549 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:28.249508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-5wlsz" event={"ID":"4c0a5e5e-d682-44d7-b461-1b13c974a88a","Type":"ContainerStarted","Data":"7c2c605b83ea2593b910d52571bab1eaea01af8b60f5f853cb4b8300781ff1cd"}
Apr 23 16:43:28.249959 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:28.249586 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:43:28.268587 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:28.268542 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-5wlsz" podStartSLOduration=0.649688537 podStartE2EDuration="3.26852907s" podCreationTimestamp="2026-04-23 16:43:25 +0000 UTC" firstStartedPulling="2026-04-23 16:43:25.507595715 +0000 UTC m=+497.529991812" lastFinishedPulling="2026-04-23 16:43:28.126436248 +0000 UTC m=+500.148832345" observedRunningTime="2026-04-23 16:43:28.266180361 +0000 UTC m=+500.288576477" watchObservedRunningTime="2026-04-23 16:43:28.26852907 +0000 UTC m=+500.290925213"
Apr 23 16:43:36.528097 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:36.528053 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cbf969bd8-vvmt9"
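The console pod's probe sequence just below is the usual startup-then-readiness handshake: the first startup probe lands before the process can answer (status "unhealthy", alongside the readiness "not ready" above), a later probe succeeds ("started"), and only then does the readiness probe flip to "ready", after which the old console pod is deleted. A toy Go server reproducing that behavior; the /healthz path and the 8-second warm-up are hypothetical, not the console's actual handler:

    // Toy server: /healthz returns 503 until initialization finishes, then
    // 200, yielding the unhealthy -> started -> ready probe sequence.
    package main

    import (
        "log"
        "net/http"
        "sync/atomic"
        "time"
    )

    func main() {
        var started atomic.Bool
        go func() {
            time.Sleep(8 * time.Second) // simulate slow startup (hypothetical)
            started.Store(true)
        }()
        http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
            if !started.Load() {
                http.Error(w, "starting", http.StatusServiceUnavailable) // probe fails
                return
            }
            w.WriteHeader(http.StatusOK) // probe passes
        })
        log.Fatal(http.ListenAndServe(":8443", nil))
    }

This ordering is what makes the rollout below safe: the Deployment controller waits for the new console replica to report Ready before the old replica is removed from service and deleted.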
Apr 23 16:43:36.528774 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:36.528114 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:36.533856 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:36.533831 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:37.283241 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:37.283211 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cbf969bd8-vvmt9"
Apr 23 16:43:37.358045 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:37.358012 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bc586b777-wlsqt"]
Apr 23 16:43:39.255374 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:43:39.255338 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-5wlsz"
Apr 23 16:44:01.280316 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.280243 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"]
Apr 23 16:44:01.283647 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.283630 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"
Apr 23 16:44:01.285964 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.285944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-swd2s\""
Apr 23 16:44:01.290226 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.290176 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"]
Apr 23 16:44:01.294040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.294024 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"
Apr 23 16:44:01.382569 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.382023 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"]
Apr 23 16:44:01.386658 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.386640 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"
Apr 23 16:44:01.394965 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.394924 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"]
Apr 23 16:44:01.436167 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.436140 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"]
Apr 23 16:44:01.438756 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:44:01.438726 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049b30eb_57f1_465f_ab2d_b6f8a18762e9.slice/crio-25ef3e5ab06571eb3aba037575524d455b2524e5bcede4612711942132a58be4 WatchSource:0}: Error finding container 25ef3e5ab06571eb3aba037575524d455b2524e5bcede4612711942132a58be4: Status 404 returned error can't find the container with id 25ef3e5ab06571eb3aba037575524d455b2524e5bcede4612711942132a58be4
Apr 23 16:44:01.481589 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.481560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/822afe44-477a-49a4-99f9-c01638ad964b-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-8dp2x\" (UID: \"822afe44-477a-49a4-99f9-c01638ad964b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"
Apr 23 16:44:01.582727 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.582692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/822afe44-477a-49a4-99f9-c01638ad964b-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-8dp2x\" (UID: \"822afe44-477a-49a4-99f9-c01638ad964b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"
Apr 23 16:44:01.583084 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.583061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/822afe44-477a-49a4-99f9-c01638ad964b-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-8dp2x\" (UID: \"822afe44-477a-49a4-99f9-c01638ad964b\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"
Apr 23 16:44:01.701557 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.701522 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"
Apr 23 16:44:01.828161 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:01.828127 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"]
Apr 23 16:44:02.370115 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.370076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" event={"ID":"822afe44-477a-49a4-99f9-c01638ad964b","Type":"ContainerStarted","Data":"dbaf3e6c87a4a122365a9cc035350789197d21c6e0924914b6092b441d8668c8"}
Apr 23 16:44:02.378591 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.378558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" event={"ID":"049b30eb-57f1-465f-ab2d-b6f8a18762e9","Type":"ContainerStarted","Data":"25ef3e5ab06571eb3aba037575524d455b2524e5bcede4612711942132a58be4"}
Apr 23 16:44:02.382975 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.382460 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5bc586b777-wlsqt" podUID="e7f39b63-535e-42da-809f-6ae8bafc6786" containerName="console" containerID="cri-o://fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb" gracePeriod=15
Apr 23 16:44:02.716775 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.716369 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bc586b777-wlsqt_e7f39b63-535e-42da-809f-6ae8bafc6786/console/0.log"
Apr 23 16:44:02.716775 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.716465 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bc586b777-wlsqt"
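"Killing container with a grace period" with gracePeriod=15 means the runtime delivers SIGTERM and escalates to SIGKILL only if the process is still alive 15 seconds later; here the old console exits on its own well within that window (the ContainerDied events below, with exit code 2), after which its volumes are unmounted and detached. A generic Go sketch of shutting down cleanly inside such a grace period; the server, port, and 10-second drain budget are illustrative, not the console's code:

    // Generic graceful-termination pattern: catch SIGTERM and drain an
    // HTTP server before the kubelet's grace period elapses.
    package main

    import (
        "context"
        "net/http"
        "os/signal"
        "syscall"
        "time"
    )

    func main() {
        srv := &http.Server{Addr: ":8080"}
        ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM)
        defer stop()

        go srv.ListenAndServe()

        <-ctx.Done() // the runtime delivered SIGTERM
        drainCtx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
        defer cancel()
        srv.Shutdown(drainCtx) // finish in-flight requests, then exit cleanly
    }

A process that ignores SIGTERM would instead run out the full 15 seconds and be killed, which shows up as exit code 137 rather than the nonzero-but-voluntary exit code 2 recorded below.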
Apr 23 16:44:02.796642 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796602 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-service-ca\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.796830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796654 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-trusted-ca-bundle\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.796830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796695 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-serving-cert\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.796830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796786 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9wvx\" (UniqueName: \"kubernetes.io/projected/e7f39b63-535e-42da-809f-6ae8bafc6786-kube-api-access-g9wvx\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.796830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796811 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-console-config\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.797034 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796839 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-oauth-serving-cert\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.797034 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.796865 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-oauth-config\") pod \"e7f39b63-535e-42da-809f-6ae8bafc6786\" (UID: \"e7f39b63-535e-42da-809f-6ae8bafc6786\") "
Apr 23 16:44:02.798272 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.798053 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-console-config" (OuterVolumeSpecName: "console-config") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:44:02.798429 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.798350 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-service-ca" (OuterVolumeSpecName: "service-ca") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:44:02.798429 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.798408 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:44:02.798563 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.798531 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:44:02.805429 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.802597 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:44:02.805429 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.805345 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7f39b63-535e-42da-809f-6ae8bafc6786-kube-api-access-g9wvx" (OuterVolumeSpecName: "kube-api-access-g9wvx") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "kube-api-access-g9wvx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:44:02.815055 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.815015 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e7f39b63-535e-42da-809f-6ae8bafc6786" (UID: "e7f39b63-535e-42da-809f-6ae8bafc6786"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897764 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-service-ca\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897802 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-trusted-ca-bundle\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897819 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897835 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9wvx\" (UniqueName: \"kubernetes.io/projected/e7f39b63-535e-42da-809f-6ae8bafc6786-kube-api-access-g9wvx\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897850 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-console-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897864 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7f39b63-535e-42da-809f-6ae8bafc6786-oauth-serving-cert\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:02.897911 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:02.897877 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7f39b63-535e-42da-809f-6ae8bafc6786-console-oauth-config\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\""
Apr 23 16:44:03.390400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.386519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bc586b777-wlsqt_e7f39b63-535e-42da-809f-6ae8bafc6786/console/0.log"
Apr 23 16:44:03.390400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.386568 2578 generic.go:358] "Generic (PLEG): container finished" podID="e7f39b63-535e-42da-809f-6ae8bafc6786" containerID="fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb" exitCode=2
Apr 23 16:44:03.390400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.386655 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc586b777-wlsqt" event={"ID":"e7f39b63-535e-42da-809f-6ae8bafc6786","Type":"ContainerDied","Data":"fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb"}
Apr 23 16:44:03.390400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.386683 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc586b777-wlsqt" event={"ID":"e7f39b63-535e-42da-809f-6ae8bafc6786","Type":"ContainerDied","Data":"4451dc59a2bffdd01585b46d405c4f0d6c63ebb1cc053a6a309803a57d1ad4f7"}
Apr 23 16:44:03.390400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.386704 2578
scope.go:117] "RemoveContainer" containerID="fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb" Apr 23 16:44:03.390400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.386858 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bc586b777-wlsqt" Apr 23 16:44:03.410785 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.409457 2578 scope.go:117] "RemoveContainer" containerID="fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb" Apr 23 16:44:03.411282 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:44:03.411179 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb\": container with ID starting with fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb not found: ID does not exist" containerID="fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb" Apr 23 16:44:03.411570 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.411541 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb"} err="failed to get container status \"fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb\": rpc error: code = NotFound desc = could not find container \"fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb\": container with ID starting with fc9136da011170266950d47d700c659323b7f9fbfd5711a8cc8d0a3fe96f67eb not found: ID does not exist" Apr 23 16:44:03.421703 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.421673 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bc586b777-wlsqt"] Apr 23 16:44:03.427373 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:03.427336 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5bc586b777-wlsqt"] Apr 23 16:44:04.582805 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:04.582767 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7f39b63-535e-42da-809f-6ae8bafc6786" path="/var/lib/kubelet/pods/e7f39b63-535e-42da-809f-6ae8bafc6786/volumes" Apr 23 16:44:16.450270 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:16.450225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" event={"ID":"822afe44-477a-49a4-99f9-c01638ad964b","Type":"ContainerStarted","Data":"4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e"} Apr 23 16:44:16.451593 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:16.451564 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" event={"ID":"049b30eb-57f1-465f-ab2d-b6f8a18762e9","Type":"ContainerStarted","Data":"5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2"} Apr 23 16:44:16.451783 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:16.451770 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" Apr 23 16:44:16.453223 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:16.453195 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.35:8080: connect: connection refused" Apr 23 16:44:16.480885 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:16.480822 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podStartSLOduration=1.230824117 podStartE2EDuration="15.48080171s" podCreationTimestamp="2026-04-23 16:44:01 +0000 UTC" firstStartedPulling="2026-04-23 16:44:01.440880915 +0000 UTC m=+533.463277010" lastFinishedPulling="2026-04-23 16:44:15.690858506 +0000 UTC m=+547.713254603" observedRunningTime="2026-04-23 16:44:16.47953926 +0000 UTC m=+548.501935389" watchObservedRunningTime="2026-04-23 16:44:16.48080171 +0000 UTC m=+548.503197826" Apr 23 16:44:17.455159 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:17.455116 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 16:44:19.462416 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:19.462303 2578 generic.go:358] "Generic (PLEG): container finished" podID="822afe44-477a-49a4-99f9-c01638ad964b" containerID="4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e" exitCode=0 Apr 23 16:44:19.462416 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:19.462396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" event={"ID":"822afe44-477a-49a4-99f9-c01638ad964b","Type":"ContainerDied","Data":"4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e"} Apr 23 16:44:27.455425 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:27.455362 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 16:44:37.455718 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:37.455666 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 16:44:41.554588 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:41.554506 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" event={"ID":"822afe44-477a-49a4-99f9-c01638ad964b","Type":"ContainerStarted","Data":"62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f"} Apr 23 16:44:41.555041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:41.554819 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" Apr 23 16:44:41.556040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:41.556014 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:44:41.571568 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:41.571522 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podStartSLOduration=1.3146201020000001 podStartE2EDuration="40.571509595s" podCreationTimestamp="2026-04-23 16:44:01 +0000 UTC" firstStartedPulling="2026-04-23 16:44:01.835027394 +0000 UTC m=+533.857423489" lastFinishedPulling="2026-04-23 16:44:41.091916874 +0000 UTC m=+573.114312982" observedRunningTime="2026-04-23 16:44:41.570175926 +0000 UTC m=+573.592572067" watchObservedRunningTime="2026-04-23 16:44:41.571509595 +0000 UTC m=+573.593905710" Apr 23 16:44:42.559021 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:42.558979 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:44:47.456040 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:47.455985 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 16:44:52.559477 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:52.559437 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:44:57.455417 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:44:57.455358 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 23 16:45:02.559023 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:02.558981 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:45:07.456560 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:07.456526 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" Apr 23 16:45:08.486428 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:08.486401 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:45:08.487702 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:08.487682 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:45:12.559615 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:12.559573 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: 
connection refused" Apr 23 16:45:22.559366 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:22.559323 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:45:31.319173 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.319141 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"] Apr 23 16:45:31.319588 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.319369 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" containerID="cri-o://5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2" gracePeriod=30 Apr 23 16:45:31.494711 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.494679 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx"] Apr 23 16:45:31.495071 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.495058 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7f39b63-535e-42da-809f-6ae8bafc6786" containerName="console" Apr 23 16:45:31.495118 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.495073 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7f39b63-535e-42da-809f-6ae8bafc6786" containerName="console" Apr 23 16:45:31.495152 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.495139 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7f39b63-535e-42da-809f-6ae8bafc6786" containerName="console" Apr 23 16:45:31.498077 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.498054 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" Apr 23 16:45:31.507449 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.507426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx"] Apr 23 16:45:31.508626 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.508610 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" Apr 23 16:45:31.634491 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.634465 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx"] Apr 23 16:45:31.636852 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:45:31.636825 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09134b41_beba_4d57_91d6_5756496e8ec3.slice/crio-b826026d04826fbb8db22b84972223933749b20eb651c1f68273567f28375309 WatchSource:0}: Error finding container b826026d04826fbb8db22b84972223933749b20eb651c1f68273567f28375309: Status 404 returned error can't find the container with id b826026d04826fbb8db22b84972223933749b20eb651c1f68273567f28375309 Apr 23 16:45:31.723626 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:31.723592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" event={"ID":"09134b41-beba-4d57-91d6-5756496e8ec3","Type":"ContainerStarted","Data":"b826026d04826fbb8db22b84972223933749b20eb651c1f68273567f28375309"} Apr 23 16:45:32.559356 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:32.559315 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:45:32.728319 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:32.728283 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" event={"ID":"09134b41-beba-4d57-91d6-5756496e8ec3","Type":"ContainerStarted","Data":"f4a54df7d5967e0d7e7e2d687470a143a3f54fb9c93aded4998abe88abfc464c"} Apr 23 16:45:32.728492 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:32.728409 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" Apr 23 16:45:32.729847 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:32.729817 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 16:45:32.764059 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:32.764003 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podStartSLOduration=1.763987301 podStartE2EDuration="1.763987301s" podCreationTimestamp="2026-04-23 16:45:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:45:32.763118472 +0000 UTC m=+624.785514589" watchObservedRunningTime="2026-04-23 16:45:32.763987301 +0000 UTC m=+624.786383421" Apr 23 16:45:33.732253 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:33.732212 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: 
connection refused" Apr 23 16:45:34.969734 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:34.969711 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" Apr 23 16:45:35.739864 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.739827 2578 generic.go:358] "Generic (PLEG): container finished" podID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerID="5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2" exitCode=0 Apr 23 16:45:35.740041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.739898 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" Apr 23 16:45:35.740041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.739904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" event={"ID":"049b30eb-57f1-465f-ab2d-b6f8a18762e9","Type":"ContainerDied","Data":"5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2"} Apr 23 16:45:35.740041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.739941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd" event={"ID":"049b30eb-57f1-465f-ab2d-b6f8a18762e9","Type":"ContainerDied","Data":"25ef3e5ab06571eb3aba037575524d455b2524e5bcede4612711942132a58be4"} Apr 23 16:45:35.740041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.739956 2578 scope.go:117] "RemoveContainer" containerID="5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2" Apr 23 16:45:35.751674 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.749186 2578 scope.go:117] "RemoveContainer" containerID="5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2" Apr 23 16:45:35.752164 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:45:35.751962 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2\": container with ID starting with 5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2 not found: ID does not exist" containerID="5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2" Apr 23 16:45:35.752164 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.752000 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2"} err="failed to get container status \"5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2\": rpc error: code = NotFound desc = could not find container \"5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2\": container with ID starting with 5bc99d852921846143891ec548b7b653fc3d15f4b0673b4feff5fe6815c090a2 not found: ID does not exist" Apr 23 16:45:35.766424 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.766394 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"] Apr 23 16:45:35.770648 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:35.770627 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bdc4-predictor-66dc784fb6-4rmbd"] Apr 23 16:45:36.576588 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:36.576551 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" path="/var/lib/kubelet/pods/049b30eb-57f1-465f-ab2d-b6f8a18762e9/volumes" Apr 23 16:45:42.560572 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:42.560546 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" Apr 23 16:45:43.733225 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:43.733181 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 16:45:53.732830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:45:53.732785 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 16:46:03.732524 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:03.732485 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 16:46:11.414005 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.413962 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc"] Apr 23 16:46:11.415028 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.415007 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" Apr 23 16:46:11.415105 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.415030 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" Apr 23 16:46:11.415158 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.415146 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="049b30eb-57f1-465f-ab2d-b6f8a18762e9" containerName="kserve-container" Apr 23 16:46:11.419935 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.419914 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" Apr 23 16:46:11.423558 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.423537 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc"] Apr 23 16:46:11.431176 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.431156 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" Apr 23 16:46:11.432925 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.432902 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"] Apr 23 16:46:11.433277 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.433230 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" containerID="cri-o://62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f" gracePeriod=30 Apr 23 16:46:11.551247 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.551219 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc"] Apr 23 16:46:11.556668 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.556640 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:46:11.866564 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.866521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" event={"ID":"027ca991-c02f-4657-9ff8-fbc4abde661c","Type":"ContainerStarted","Data":"625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6"} Apr 23 16:46:11.866564 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.866569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" event={"ID":"027ca991-c02f-4657-9ff8-fbc4abde661c","Type":"ContainerStarted","Data":"b2237c340a350ac93e8685f5033fb29c50b5f6c8e2798e07dbb725c8e8b0456e"} Apr 23 16:46:11.866796 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.866747 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" Apr 23 16:46:11.867886 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.867859 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 16:46:11.880430 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:11.880365 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podStartSLOduration=0.880352815 podStartE2EDuration="880.352815ms" podCreationTimestamp="2026-04-23 16:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:46:11.879422132 +0000 UTC m=+663.901818247" watchObservedRunningTime="2026-04-23 16:46:11.880352815 +0000 UTC m=+663.902748932" Apr 23 16:46:12.559030 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:12.558991 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 23 16:46:12.870047 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:12.869960 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 16:46:13.733296 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:13.733255 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 16:46:15.282473 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.282443 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" Apr 23 16:46:15.448082 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.448042 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/822afe44-477a-49a4-99f9-c01638ad964b-kserve-provision-location\") pod \"822afe44-477a-49a4-99f9-c01638ad964b\" (UID: \"822afe44-477a-49a4-99f9-c01638ad964b\") " Apr 23 16:46:15.448472 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.448438 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822afe44-477a-49a4-99f9-c01638ad964b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "822afe44-477a-49a4-99f9-c01638ad964b" (UID: "822afe44-477a-49a4-99f9-c01638ad964b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:46:15.549647 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.549607 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/822afe44-477a-49a4-99f9-c01638ad964b-kserve-provision-location\") on node \"ip-10-0-129-102.ec2.internal\" DevicePath \"\"" Apr 23 16:46:15.884317 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.884281 2578 generic.go:358] "Generic (PLEG): container finished" podID="822afe44-477a-49a4-99f9-c01638ad964b" containerID="62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f" exitCode=0 Apr 23 16:46:15.884506 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.884353 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" Apr 23 16:46:15.884506 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.884366 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" event={"ID":"822afe44-477a-49a4-99f9-c01638ad964b","Type":"ContainerDied","Data":"62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f"} Apr 23 16:46:15.884506 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.884433 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x" event={"ID":"822afe44-477a-49a4-99f9-c01638ad964b","Type":"ContainerDied","Data":"dbaf3e6c87a4a122365a9cc035350789197d21c6e0924914b6092b441d8668c8"} Apr 23 16:46:15.884506 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.884453 2578 scope.go:117] "RemoveContainer" containerID="62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f" Apr 23 16:46:15.893022 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.893003 2578 scope.go:117] "RemoveContainer" containerID="4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e" Apr 23 16:46:15.900372 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.900351 2578 scope.go:117] "RemoveContainer" containerID="62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f" Apr 23 16:46:15.900633 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:46:15.900613 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f\": container with ID starting with 62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f not found: ID does not exist" containerID="62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f" Apr 23 16:46:15.900688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.900641 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f"} err="failed to get container status \"62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f\": rpc error: code = NotFound desc = could not find container \"62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f\": container with ID starting with 62bae0fa692fa7672a234eb193f41d22fdd7896332fad51f794d89fabd4cb74f not found: ID does not exist" Apr 23 16:46:15.900688 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.900660 2578 scope.go:117] "RemoveContainer" containerID="4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e" Apr 23 16:46:15.900895 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:46:15.900879 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e\": container with ID starting with 4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e not found: ID does not exist" containerID="4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e" Apr 23 16:46:15.900944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.900898 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e"} err="failed to get container status \"4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e\": rpc error: code 
= NotFound desc = could not find container \"4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e\": container with ID starting with 4f0e3338c37f88be80687fcedb4d163de48e9ee61f386885950889b574005d4e not found: ID does not exist" Apr 23 16:46:15.906315 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.906289 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"] Apr 23 16:46:15.908117 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:15.908094 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-8dp2x"] Apr 23 16:46:16.576469 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:16.576433 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822afe44-477a-49a4-99f9-c01638ad964b" path="/var/lib/kubelet/pods/822afe44-477a-49a4-99f9-c01638ad964b/volumes" Apr 23 16:46:22.870465 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:22.870417 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 16:46:23.733632 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:23.733601 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" Apr 23 16:46:32.871045 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:32.870997 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 16:46:42.871122 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:42.871074 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 16:46:52.871069 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:46:52.871026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 16:47:02.871624 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:47:02.871546 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" Apr 23 16:50:08.512468 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:50:08.512401 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:50:08.514897 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:50:08.514872 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:54:56.378627 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.378594 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx"] Apr 23 16:54:56.379081 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.378881 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" containerID="cri-o://f4a54df7d5967e0d7e7e2d687470a143a3f54fb9c93aded4998abe88abfc464c" gracePeriod=30 Apr 23 16:54:56.443923 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.443885 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb"] Apr 23 16:54:56.444286 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.444274 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="storage-initializer" Apr 23 16:54:56.444329 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.444288 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="storage-initializer" Apr 23 16:54:56.444329 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.444304 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" Apr 23 16:54:56.444329 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.444309 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" Apr 23 16:54:56.444437 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.444363 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="822afe44-477a-49a4-99f9-c01638ad964b" containerName="kserve-container" Apr 23 16:54:56.447234 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.447218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" Apr 23 16:54:56.455959 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.455931 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb"] Apr 23 16:54:56.457648 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.457626 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" Apr 23 16:54:56.785189 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.785165 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb"] Apr 23 16:54:56.787048 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:54:56.787020 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3659e25d_c7f7_4979_bfd8_d8ff93d60c90.slice/crio-16c312417fb7a9ba1c234cd6ba10972b4a20ea7bb1a15c0a31d18033d58151f5 WatchSource:0}: Error finding container 16c312417fb7a9ba1c234cd6ba10972b4a20ea7bb1a15c0a31d18033d58151f5: Status 404 returned error can't find the container with id 16c312417fb7a9ba1c234cd6ba10972b4a20ea7bb1a15c0a31d18033d58151f5 Apr 23 16:54:56.789165 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:56.789150 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:54:57.637874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:57.637838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" event={"ID":"3659e25d-c7f7-4979-bfd8-d8ff93d60c90","Type":"ContainerStarted","Data":"9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f"} Apr 23 16:54:57.637874 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:57.637875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" event={"ID":"3659e25d-c7f7-4979-bfd8-d8ff93d60c90","Type":"ContainerStarted","Data":"16c312417fb7a9ba1c234cd6ba10972b4a20ea7bb1a15c0a31d18033d58151f5"} Apr 23 16:54:57.638366 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:57.638005 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" Apr 23 16:54:57.639318 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:57.639287 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 16:54:57.654396 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:57.654323 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podStartSLOduration=1.654306558 podStartE2EDuration="1.654306558s" podCreationTimestamp="2026-04-23 16:54:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:57.651708554 +0000 UTC m=+1189.674104671" watchObservedRunningTime="2026-04-23 16:54:57.654306558 +0000 UTC m=+1189.676702675" Apr 23 16:54:58.641076 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:58.641035 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 16:54:59.645736 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:59.645701 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="09134b41-beba-4d57-91d6-5756496e8ec3" containerID="f4a54df7d5967e0d7e7e2d687470a143a3f54fb9c93aded4998abe88abfc464c" exitCode=0 Apr 23 16:54:59.646108 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:59.645774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" event={"ID":"09134b41-beba-4d57-91d6-5756496e8ec3","Type":"ContainerDied","Data":"f4a54df7d5967e0d7e7e2d687470a143a3f54fb9c93aded4998abe88abfc464c"} Apr 23 16:54:59.819619 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:54:59.819592 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" Apr 23 16:55:00.650033 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:00.650004 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" event={"ID":"09134b41-beba-4d57-91d6-5756496e8ec3","Type":"ContainerDied","Data":"b826026d04826fbb8db22b84972223933749b20eb651c1f68273567f28375309"} Apr 23 16:55:00.650454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:00.650048 2578 scope.go:117] "RemoveContainer" containerID="f4a54df7d5967e0d7e7e2d687470a143a3f54fb9c93aded4998abe88abfc464c" Apr 23 16:55:00.650454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:00.650063 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx" Apr 23 16:55:00.672816 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:00.672789 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx"] Apr 23 16:55:00.674634 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:00.674612 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-be540-predictor-77756f784b-sh7vx"] Apr 23 16:55:02.576371 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:02.576336 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" path="/var/lib/kubelet/pods/09134b41-beba-4d57-91d6-5756496e8ec3/volumes" Apr 23 16:55:08.536675 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:08.536647 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:55:08.540556 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:08.540539 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 16:55:08.641893 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:08.641858 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 16:55:18.641121 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:18.641077 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 16:55:28.641069 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:28.641026 
2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 23 16:55:36.290460 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.290424 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc"] Apr 23 16:55:36.290881 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.290735 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" containerID="cri-o://625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6" gracePeriod=30 Apr 23 16:55:36.311598 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.311570 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc"] Apr 23 16:55:36.311983 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.311970 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" Apr 23 16:55:36.312024 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.311985 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" Apr 23 16:55:36.312061 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.312052 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="09134b41-beba-4d57-91d6-5756496e8ec3" containerName="kserve-container" Apr 23 16:55:36.315033 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.315018 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" Apr 23 16:55:36.320860 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.320834 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc"] Apr 23 16:55:36.325039 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.325022 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" Apr 23 16:55:36.447926 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.447897 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc"] Apr 23 16:55:36.450029 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:55:36.450002 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f124ff_cce0_4699_bc6a_ffc58b185e1f.slice/crio-37f1e0ec82ca46899f9e4f629c86f59bf3ca0f4d90944219e7589a89af0140d9 WatchSource:0}: Error finding container 37f1e0ec82ca46899f9e4f629c86f59bf3ca0f4d90944219e7589a89af0140d9: Status 404 returned error can't find the container with id 37f1e0ec82ca46899f9e4f629c86f59bf3ca0f4d90944219e7589a89af0140d9 Apr 23 16:55:36.781263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.781226 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" event={"ID":"30f124ff-cce0-4699-bc6a-ffc58b185e1f","Type":"ContainerStarted","Data":"c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e"} Apr 23 16:55:36.781263 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.781270 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" event={"ID":"30f124ff-cce0-4699-bc6a-ffc58b185e1f","Type":"ContainerStarted","Data":"37f1e0ec82ca46899f9e4f629c86f59bf3ca0f4d90944219e7589a89af0140d9"} Apr 23 16:55:36.781527 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.781366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" Apr 23 16:55:36.782601 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.782571 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:55:36.796436 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:36.796371 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podStartSLOduration=0.796357881 podStartE2EDuration="796.357881ms" podCreationTimestamp="2026-04-23 16:55:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:55:36.794066887 +0000 UTC m=+1228.816463015" watchObservedRunningTime="2026-04-23 16:55:36.796357881 +0000 UTC m=+1228.818753996" Apr 23 16:55:37.786960 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:37.786914 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:55:38.641442 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:38.641400 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection 
refused" Apr 23 16:55:39.631084 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.631059 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" Apr 23 16:55:39.793761 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.793679 2578 generic.go:358] "Generic (PLEG): container finished" podID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerID="625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6" exitCode=0 Apr 23 16:55:39.793761 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.793741 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" Apr 23 16:55:39.793970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.793756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" event={"ID":"027ca991-c02f-4657-9ff8-fbc4abde661c","Type":"ContainerDied","Data":"625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6"} Apr 23 16:55:39.793970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.793787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc" event={"ID":"027ca991-c02f-4657-9ff8-fbc4abde661c","Type":"ContainerDied","Data":"b2237c340a350ac93e8685f5033fb29c50b5f6c8e2798e07dbb725c8e8b0456e"} Apr 23 16:55:39.793970 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.793802 2578 scope.go:117] "RemoveContainer" containerID="625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6" Apr 23 16:55:39.802599 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.802562 2578 scope.go:117] "RemoveContainer" containerID="625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6" Apr 23 16:55:39.802788 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:55:39.802771 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6\": container with ID starting with 625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6 not found: ID does not exist" containerID="625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6" Apr 23 16:55:39.802830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.802796 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6"} err="failed to get container status \"625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6\": rpc error: code = NotFound desc = could not find container \"625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6\": container with ID starting with 625d021c2562e3a6a33dd4ac5f7c8c91afa57134bea7928b3b048ebf44d68fd6 not found: ID does not exist" Apr 23 16:55:39.814525 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.814503 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc"] Apr 23 16:55:39.817905 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:39.817876 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-39400-predictor-755bc7b66-hcrwc"] Apr 23 16:55:40.575830 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:40.575789 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" path="/var/lib/kubelet/pods/027ca991-c02f-4657-9ff8-fbc4abde661c/volumes" Apr 23 16:55:47.786946 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:47.786901 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:55:48.642180 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:48.642147 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" Apr 23 16:55:57.787652 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:55:57.787538 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:56:07.787258 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:07.787210 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:56:16.674856 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.674822 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb"] Apr 23 16:56:16.675304 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.675068 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" containerID="cri-o://9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f" gracePeriod=30 Apr 23 16:56:16.800226 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.800187 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh"] Apr 23 16:56:16.800596 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.800583 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" Apr 23 16:56:16.800647 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.800598 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" Apr 23 16:56:16.800680 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.800657 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="027ca991-c02f-4657-9ff8-fbc4abde661c" containerName="kserve-container" Apr 23 16:56:16.805484 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.805463 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" Apr 23 16:56:16.812808 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.812778 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh"] Apr 23 16:56:16.816320 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.816298 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" Apr 23 16:56:16.944404 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:16.944357 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh"] Apr 23 16:56:16.946534 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:56:16.946504 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf88044_c1a4_40b6_b6ff_4fe41d23c6ad.slice/crio-7e32e8ac04bb14e03848cf7adc2b5a9f44c69ed3238baf046b11850343243860 WatchSource:0}: Error finding container 7e32e8ac04bb14e03848cf7adc2b5a9f44c69ed3238baf046b11850343243860: Status 404 returned error can't find the container with id 7e32e8ac04bb14e03848cf7adc2b5a9f44c69ed3238baf046b11850343243860 Apr 23 16:56:17.787600 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:17.787542 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:56:17.931726 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:17.931686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" event={"ID":"bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad","Type":"ContainerStarted","Data":"8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f"} Apr 23 16:56:17.931726 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:17.931731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" event={"ID":"bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad","Type":"ContainerStarted","Data":"7e32e8ac04bb14e03848cf7adc2b5a9f44c69ed3238baf046b11850343243860"} Apr 23 16:56:17.931954 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:17.931848 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" Apr 23 16:56:17.932999 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:17.932972 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 16:56:17.949500 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:17.949439 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podStartSLOduration=1.9494218559999998 podStartE2EDuration="1.949421856s" podCreationTimestamp="2026-04-23 16:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:56:17.94858415 +0000 UTC m=+1269.970980281" watchObservedRunningTime="2026-04-23 16:56:17.949421856 +0000 UTC m=+1269.971818061" Apr 23 16:56:18.642204 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:18.642162 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: 
connection refused" Apr 23 16:56:18.935998 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:18.935908 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 16:56:20.215093 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.215067 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" Apr 23 16:56:20.944154 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.944060 2578 generic.go:358] "Generic (PLEG): container finished" podID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerID="9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f" exitCode=0 Apr 23 16:56:20.944154 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.944120 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" Apr 23 16:56:20.944364 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.944148 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" event={"ID":"3659e25d-c7f7-4979-bfd8-d8ff93d60c90","Type":"ContainerDied","Data":"9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f"} Apr 23 16:56:20.944364 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.944185 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb" event={"ID":"3659e25d-c7f7-4979-bfd8-d8ff93d60c90","Type":"ContainerDied","Data":"16c312417fb7a9ba1c234cd6ba10972b4a20ea7bb1a15c0a31d18033d58151f5"} Apr 23 16:56:20.944364 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.944201 2578 scope.go:117] "RemoveContainer" containerID="9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f" Apr 23 16:56:20.953901 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.953874 2578 scope.go:117] "RemoveContainer" containerID="9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f" Apr 23 16:56:20.954269 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:56:20.954246 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f\": container with ID starting with 9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f not found: ID does not exist" containerID="9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f" Apr 23 16:56:20.954350 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.954281 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f"} err="failed to get container status \"9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f\": rpc error: code = NotFound desc = could not find container \"9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f\": container with ID starting with 9b4f922e1eea306aab8a2ebd09a56ebc9aa682f126fb9ef0b5693022653ba54f not found: ID does not exist" Apr 23 16:56:20.964757 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.964731 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb"] Apr 23 16:56:20.970082 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:20.970059 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-c88c1-predictor-6df56fdd5b-4dnfb"] Apr 23 16:56:22.581783 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:22.581742 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" path="/var/lib/kubelet/pods/3659e25d-c7f7-4979-bfd8-d8ff93d60c90/volumes" Apr 23 16:56:27.789135 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:27.789104 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" Apr 23 16:56:28.936803 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:28.936760 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 16:56:38.936558 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:38.936516 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 16:56:48.936416 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:48.936353 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 16:56:56.484727 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.484694 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc"] Apr 23 16:56:56.485080 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.484914 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" containerID="cri-o://c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e" gracePeriod=30 Apr 23 16:56:56.544963 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.544929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782"] Apr 23 16:56:56.545338 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.545322 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" Apr 23 16:56:56.545400 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.545340 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" Apr 23 16:56:56.545454 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.545443 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3659e25d-c7f7-4979-bfd8-d8ff93d60c90" containerName="kserve-container" Apr 23 16:56:56.548310 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.548287 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" Apr 23 16:56:56.555115 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.555087 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782"] Apr 23 16:56:56.559084 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.559064 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" Apr 23 16:56:56.680039 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:56.679964 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782"] Apr 23 16:56:56.682244 ip-10-0-129-102 kubenswrapper[2578]: W0423 16:56:56.682216 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e550a89_ceb6_45b4_a611_ec97d98ace2f.slice/crio-3fec01f2e1ff3f137201552bb686e949ec85f827042064378e4243ea3c894eb6 WatchSource:0}: Error finding container 3fec01f2e1ff3f137201552bb686e949ec85f827042064378e4243ea3c894eb6: Status 404 returned error can't find the container with id 3fec01f2e1ff3f137201552bb686e949ec85f827042064378e4243ea3c894eb6 Apr 23 16:56:57.080943 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:57.080899 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" event={"ID":"1e550a89-ceb6-45b4-a611-ec97d98ace2f","Type":"ContainerStarted","Data":"d811c03e56fe9d37caf27d0486253e149189cae3a0f63660382da35b60e3c3ad"} Apr 23 16:56:57.080943 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:57.080948 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" event={"ID":"1e550a89-ceb6-45b4-a611-ec97d98ace2f","Type":"ContainerStarted","Data":"3fec01f2e1ff3f137201552bb686e949ec85f827042064378e4243ea3c894eb6"} Apr 23 16:56:57.081192 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:57.081104 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" Apr 23 16:56:57.082513 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:57.082489 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 16:56:57.096566 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:57.096528 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podStartSLOduration=1.096515384 podStartE2EDuration="1.096515384s" podCreationTimestamp="2026-04-23 16:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:56:57.095371536 +0000 UTC m=+1309.117767653" watchObservedRunningTime="2026-04-23 16:56:57.096515384 +0000 UTC m=+1309.118911499" Apr 23 16:56:57.786986 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:57.786944 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 23 16:56:58.084260 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:58.084225 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 16:56:58.936600 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:58.936557 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 23 16:56:59.530223 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:56:59.530198 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" Apr 23 16:57:00.090573 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.090531 2578 generic.go:358] "Generic (PLEG): container finished" podID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerID="c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e" exitCode=0 Apr 23 16:57:00.090944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.090585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" event={"ID":"30f124ff-cce0-4699-bc6a-ffc58b185e1f","Type":"ContainerDied","Data":"c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e"} Apr 23 16:57:00.090944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.090598 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" Apr 23 16:57:00.090944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.090617 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc" event={"ID":"30f124ff-cce0-4699-bc6a-ffc58b185e1f","Type":"ContainerDied","Data":"37f1e0ec82ca46899f9e4f629c86f59bf3ca0f4d90944219e7589a89af0140d9"} Apr 23 16:57:00.090944 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.090637 2578 scope.go:117] "RemoveContainer" containerID="c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e" Apr 23 16:57:00.098740 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.098724 2578 scope.go:117] "RemoveContainer" containerID="c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e" Apr 23 16:57:00.098984 ip-10-0-129-102 kubenswrapper[2578]: E0423 16:57:00.098960 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e\": container with ID starting with c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e not found: ID does not exist" containerID="c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e" Apr 23 16:57:00.099041 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.098991 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e"} err="failed to get container status \"c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e\": rpc error: code = NotFound desc = could not find container \"c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e\": container with ID starting with c2e6b658e08c44b9ad22f8b93066019f1281496064d714ecaaf0d7b798f9c48e not found: ID does not exist" Apr 23 16:57:00.109363 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.109341 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc"] Apr 23 16:57:00.112623 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.112603 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-65292-predictor-f5b7cdf66-tjrrc"] Apr 23 16:57:00.576266 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:00.576235 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" path="/var/lib/kubelet/pods/30f124ff-cce0-4699-bc6a-ffc58b185e1f/volumes" Apr 23 16:57:08.085330 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:08.085285 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 16:57:08.937184 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:08.937151 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" Apr 23 16:57:18.085233 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:18.085189 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 16:57:28.085271 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:28.085185 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 16:57:38.085309 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:38.085261 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 16:57:48.086371 ip-10-0-129-102 kubenswrapper[2578]: I0423 16:57:48.086332 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" Apr 23 17:00:08.566636 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:00:08.566603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:00:08.570234 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:00:08.570212 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:05:08.596703 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:08.596671 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:05:08.597575 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:08.597555 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:05:41.604998 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.604965 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh"] Apr 23 17:05:41.605654 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.605223 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" containerID="cri-o://8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f" gracePeriod=30 Apr 23 17:05:41.645965 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.645937 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz"] Apr 23 17:05:41.646310 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.646297 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" Apr 23 17:05:41.646357 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.646312 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" Apr 23 17:05:41.646435 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.646423 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="30f124ff-cce0-4699-bc6a-ffc58b185e1f" containerName="kserve-container" Apr 
23 17:05:41.649338 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.649323 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" Apr 23 17:05:41.655450 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.655426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz"] Apr 23 17:05:41.660108 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.660090 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" Apr 23 17:05:41.785395 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.785345 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz"] Apr 23 17:05:41.788643 ip-10-0-129-102 kubenswrapper[2578]: W0423 17:05:41.788604 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7a2aa2a_1d34_47c8_87ad_ce6f739ee4ca.slice/crio-19f60f10fed68d8d5aca89c993d6ecdcf03dcb425229e335b00de89bdc47ebb1 WatchSource:0}: Error finding container 19f60f10fed68d8d5aca89c993d6ecdcf03dcb425229e335b00de89bdc47ebb1: Status 404 returned error can't find the container with id 19f60f10fed68d8d5aca89c993d6ecdcf03dcb425229e335b00de89bdc47ebb1 Apr 23 17:05:41.790676 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.790656 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:05:41.879443 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.879410 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" event={"ID":"d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca","Type":"ContainerStarted","Data":"1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21"} Apr 23 17:05:41.879443 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.879447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" event={"ID":"d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca","Type":"ContainerStarted","Data":"19f60f10fed68d8d5aca89c993d6ecdcf03dcb425229e335b00de89bdc47ebb1"} Apr 23 17:05:41.879662 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.879589 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" Apr 23 17:05:41.880962 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.880940 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:05:41.900164 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:41.900118 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podStartSLOduration=0.900102964 podStartE2EDuration="900.102964ms" podCreationTimestamp="2026-04-23 17:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:05:41.897368668 +0000 UTC m=+1833.919764809" watchObservedRunningTime="2026-04-23 17:05:41.900102964 +0000 UTC 
m=+1833.922499079" Apr 23 17:05:42.883492 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:42.883448 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:05:44.760138 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.760113 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" Apr 23 17:05:44.890758 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.890723 2578 generic.go:358] "Generic (PLEG): container finished" podID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerID="8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f" exitCode=0 Apr 23 17:05:44.890945 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.890782 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" Apr 23 17:05:44.890945 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.890809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" event={"ID":"bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad","Type":"ContainerDied","Data":"8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f"} Apr 23 17:05:44.890945 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.890858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh" event={"ID":"bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad","Type":"ContainerDied","Data":"7e32e8ac04bb14e03848cf7adc2b5a9f44c69ed3238baf046b11850343243860"} Apr 23 17:05:44.890945 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.890879 2578 scope.go:117] "RemoveContainer" containerID="8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f" Apr 23 17:05:44.899243 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.899228 2578 scope.go:117] "RemoveContainer" containerID="8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f" Apr 23 17:05:44.899497 ip-10-0-129-102 kubenswrapper[2578]: E0423 17:05:44.899479 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f\": container with ID starting with 8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f not found: ID does not exist" containerID="8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f" Apr 23 17:05:44.899547 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.899508 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f"} err="failed to get container status \"8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f\": rpc error: code = NotFound desc = could not find container \"8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f\": container with ID starting with 8fdc10977c70ec144ea834d2becad6c9f080f9ac27402d15dd653146d659516f not found: ID does not exist" Apr 23 17:05:44.910170 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.910148 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh"] Apr 23 17:05:44.913911 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:44.913886 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-3ffea-predictor-c9bf7cd76-kpqnh"] Apr 23 17:05:46.575864 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:46.575831 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" path="/var/lib/kubelet/pods/bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad/volumes" Apr 23 17:05:52.883762 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:05:52.883711 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:06:02.883973 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:02.883930 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:06:12.884209 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:12.884164 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:06:21.419287 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.419248 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782"] Apr 23 17:06:21.419681 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.419505 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" containerID="cri-o://d811c03e56fe9d37caf27d0486253e149189cae3a0f63660382da35b60e3c3ad" gracePeriod=30 Apr 23 17:06:21.424627 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.424601 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9"] Apr 23 17:06:21.424970 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.424959 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" Apr 23 17:06:21.425011 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.424972 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" Apr 23 17:06:21.425057 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.425034 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcf88044-c1a4-40b6-b6ff-4fe41d23c6ad" containerName="kserve-container" Apr 23 17:06:21.429137 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.429117 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" Apr 23 17:06:21.435542 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.435513 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9"] Apr 23 17:06:21.439743 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.439720 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" Apr 23 17:06:21.575041 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:21.575009 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9"] Apr 23 17:06:21.578354 ip-10-0-129-102 kubenswrapper[2578]: W0423 17:06:21.578324 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9438b07_56b1_4b01_bab3_1e245eba99ac.slice/crio-343d0d4fc2c2a5d3508c47cbfdb122a933aaf93b2e60989c14b87b63fef9a455 WatchSource:0}: Error finding container 343d0d4fc2c2a5d3508c47cbfdb122a933aaf93b2e60989c14b87b63fef9a455: Status 404 returned error can't find the container with id 343d0d4fc2c2a5d3508c47cbfdb122a933aaf93b2e60989c14b87b63fef9a455 Apr 23 17:06:22.015276 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:22.015232 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" event={"ID":"f9438b07-56b1-4b01-bab3-1e245eba99ac","Type":"ContainerStarted","Data":"0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5"} Apr 23 17:06:22.015276 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:22.015279 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" event={"ID":"f9438b07-56b1-4b01-bab3-1e245eba99ac","Type":"ContainerStarted","Data":"343d0d4fc2c2a5d3508c47cbfdb122a933aaf93b2e60989c14b87b63fef9a455"} Apr 23 17:06:22.015559 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:22.015422 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" Apr 23 17:06:22.016807 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:22.016778 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:06:22.030798 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:22.030754 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podStartSLOduration=1.030741925 podStartE2EDuration="1.030741925s" podCreationTimestamp="2026-04-23 17:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:06:22.02903386 +0000 UTC m=+1874.051429976" watchObservedRunningTime="2026-04-23 17:06:22.030741925 +0000 UTC m=+1874.053138084" Apr 23 17:06:22.883651 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:22.883615 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:06:23.018998 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:23.018963 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:06:25.027583 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:25.027552 2578 generic.go:358] "Generic (PLEG): container finished" podID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerID="d811c03e56fe9d37caf27d0486253e149189cae3a0f63660382da35b60e3c3ad" exitCode=0 Apr 23 17:06:25.027935 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:25.027633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" event={"ID":"1e550a89-ceb6-45b4-a611-ec97d98ace2f","Type":"ContainerDied","Data":"d811c03e56fe9d37caf27d0486253e149189cae3a0f63660382da35b60e3c3ad"} Apr 23 17:06:25.078102 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:25.078074 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" Apr 23 17:06:26.032122 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:26.032082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" event={"ID":"1e550a89-ceb6-45b4-a611-ec97d98ace2f","Type":"ContainerDied","Data":"3fec01f2e1ff3f137201552bb686e949ec85f827042064378e4243ea3c894eb6"} Apr 23 17:06:26.032122 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:26.032129 2578 scope.go:117] "RemoveContainer" containerID="d811c03e56fe9d37caf27d0486253e149189cae3a0f63660382da35b60e3c3ad" Apr 23 17:06:26.032663 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:26.032129 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782" Apr 23 17:06:26.052487 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:26.052455 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782"] Apr 23 17:06:26.055944 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:26.055917 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-a7ecd-predictor-5bdddf8766-6g782"] Apr 23 17:06:26.576612 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:26.576579 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" path="/var/lib/kubelet/pods/1e550a89-ceb6-45b4-a611-ec97d98ace2f/volumes" Apr 23 17:06:32.884713 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:32.884683 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" Apr 23 17:06:33.019964 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:33.019925 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:06:43.020087 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:43.020041 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:06:53.019925 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:06:53.019874 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:07:01.846938 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.846900 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz"] Apr 23 17:07:01.847332 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.847212 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" containerID="cri-o://1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21" gracePeriod=30 Apr 23 17:07:01.974961 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.974919 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6"] Apr 23 17:07:01.975371 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.975353 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" Apr 23 17:07:01.975501 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.975373 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" Apr 23 17:07:01.975501 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.975488 2578 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="1e550a89-ceb6-45b4-a611-ec97d98ace2f" containerName="kserve-container" Apr 23 17:07:01.978782 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.978759 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" Apr 23 17:07:01.984859 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.984832 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6"] Apr 23 17:07:01.989185 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:01.989165 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" Apr 23 17:07:02.119174 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:02.119096 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6"] Apr 23 17:07:02.122436 ip-10-0-129-102 kubenswrapper[2578]: W0423 17:07:02.122408 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8431775_fe3c_414b_9d9b_e9586527e351.slice/crio-3e93c9b21b4da7c1f3660ce7a0884efbf3ad99e3d7cbeb7de882629a7ed7166a WatchSource:0}: Error finding container 3e93c9b21b4da7c1f3660ce7a0884efbf3ad99e3d7cbeb7de882629a7ed7166a: Status 404 returned error can't find the container with id 3e93c9b21b4da7c1f3660ce7a0884efbf3ad99e3d7cbeb7de882629a7ed7166a Apr 23 17:07:02.157264 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:02.157230 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" event={"ID":"c8431775-fe3c-414b-9d9b-e9586527e351","Type":"ContainerStarted","Data":"3e93c9b21b4da7c1f3660ce7a0884efbf3ad99e3d7cbeb7de882629a7ed7166a"} Apr 23 17:07:02.884060 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:02.884020 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 23 17:07:03.019366 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:03.019320 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:07:03.161444 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:03.161338 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" event={"ID":"c8431775-fe3c-414b-9d9b-e9586527e351","Type":"ContainerStarted","Data":"3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325"} Apr 23 17:07:03.161444 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:03.161401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" Apr 23 17:07:03.162804 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:03.162777 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:07:03.177186 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:03.177135 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podStartSLOduration=2.17711987 podStartE2EDuration="2.17711987s" podCreationTimestamp="2026-04-23 17:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:07:03.175892354 +0000 UTC m=+1915.198288469" watchObservedRunningTime="2026-04-23 17:07:03.17711987 +0000 UTC m=+1915.199515986" Apr 23 17:07:04.164581 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:04.164545 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:07:05.399936 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:05.399909 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" Apr 23 17:07:06.172126 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.172090 2578 generic.go:358] "Generic (PLEG): container finished" podID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerID="1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21" exitCode=0 Apr 23 17:07:06.172297 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.172149 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" Apr 23 17:07:06.172297 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.172170 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" event={"ID":"d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca","Type":"ContainerDied","Data":"1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21"} Apr 23 17:07:06.172297 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.172208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz" event={"ID":"d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca","Type":"ContainerDied","Data":"19f60f10fed68d8d5aca89c993d6ecdcf03dcb425229e335b00de89bdc47ebb1"} Apr 23 17:07:06.172297 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.172224 2578 scope.go:117] "RemoveContainer" containerID="1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21" Apr 23 17:07:06.180664 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.180644 2578 scope.go:117] "RemoveContainer" containerID="1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21" Apr 23 17:07:06.180967 ip-10-0-129-102 kubenswrapper[2578]: E0423 17:07:06.180948 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21\": container with ID starting with 1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21 not found: ID does not exist" containerID="1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21" Apr 23 17:07:06.181037 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.180978 2578 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21"} err="failed to get container status \"1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21\": rpc error: code = NotFound desc = could not find container \"1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21\": container with ID starting with 1a4a5f2f610ee6d7c456e4a6bafc01f2da9024f1f4a6b9fe19b467b3a62e5c21 not found: ID does not exist" Apr 23 17:07:06.191593 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.191566 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz"] Apr 23 17:07:06.194348 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.194323 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-56812-predictor-7f65bbd66b-64cmz"] Apr 23 17:07:06.576497 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:06.576464 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" path="/var/lib/kubelet/pods/d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca/volumes" Apr 23 17:07:13.020311 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:13.020270 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" Apr 23 17:07:14.164983 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:14.164937 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:07:24.165093 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:24.165042 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:07:34.165117 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:34.165067 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:07:44.164937 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:44.164887 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:07:54.165860 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:07:54.165826 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" Apr 23 17:10:08.620093 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:10:08.620066 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:10:08.622689 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:10:08.622500 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:15:08.643884 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:15:08.643776 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:15:08.648007 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:15:08.647219 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:16:26.854525 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:26.854490 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6"] Apr 23 17:16:26.855047 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:26.854725 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" containerID="cri-o://3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325" gracePeriod=30 Apr 23 17:16:29.910555 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:29.910527 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" Apr 23 17:16:30.091884 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.091840 2578 generic.go:358] "Generic (PLEG): container finished" podID="c8431775-fe3c-414b-9d9b-e9586527e351" containerID="3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325" exitCode=0 Apr 23 17:16:30.092213 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.091909 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" Apr 23 17:16:30.092213 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.091929 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" event={"ID":"c8431775-fe3c-414b-9d9b-e9586527e351","Type":"ContainerDied","Data":"3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325"} Apr 23 17:16:30.092213 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.091975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6" event={"ID":"c8431775-fe3c-414b-9d9b-e9586527e351","Type":"ContainerDied","Data":"3e93c9b21b4da7c1f3660ce7a0884efbf3ad99e3d7cbeb7de882629a7ed7166a"} Apr 23 17:16:30.092213 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.091993 2578 scope.go:117] "RemoveContainer" containerID="3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325" Apr 23 17:16:30.100517 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.100488 2578 scope.go:117] "RemoveContainer" containerID="3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325" Apr 23 17:16:30.100774 ip-10-0-129-102 kubenswrapper[2578]: E0423 17:16:30.100757 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325\": container with ID starting with 3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325 not found: ID does not exist" containerID="3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325" Apr 23 17:16:30.100824 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.100785 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325"} err="failed to get container status \"3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325\": rpc error: code = NotFound desc = could not find container \"3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325\": container with ID starting with 3d06b204a01dfb21262752acce05becac3f44fb544e4e979720737f835cea325 not found: ID does not exist" Apr 23 17:16:30.111513 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.111486 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6"] Apr 23 17:16:30.114947 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.114917 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-5bc06-predictor-5b55f49dc8-sxjj6"] Apr 23 17:16:30.575943 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:16:30.575911 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" path="/var/lib/kubelet/pods/c8431775-fe3c-414b-9d9b-e9586527e351/volumes" Apr 23 17:20:08.668573 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:20:08.668441 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:20:08.672732 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:20:08.671335 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log" Apr 23 17:23:50.812335 ip-10-0-129-102 
kubenswrapper[2578]: I0423 17:23:50.812296 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9"] Apr 23 17:23:50.812829 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:50.812610 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" containerID="cri-o://0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5" gracePeriod=30 Apr 23 17:23:53.019081 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:53.019034 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 23 17:23:54.058565 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.058542 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" Apr 23 17:23:54.589584 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.589548 2578 generic.go:358] "Generic (PLEG): container finished" podID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerID="0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5" exitCode=0 Apr 23 17:23:54.589745 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.589607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" event={"ID":"f9438b07-56b1-4b01-bab3-1e245eba99ac","Type":"ContainerDied","Data":"0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5"} Apr 23 17:23:54.589745 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.589629 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" Apr 23 17:23:54.589745 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.589646 2578 scope.go:117] "RemoveContainer" containerID="0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5" Apr 23 17:23:54.589745 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.589634 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9" event={"ID":"f9438b07-56b1-4b01-bab3-1e245eba99ac","Type":"ContainerDied","Data":"343d0d4fc2c2a5d3508c47cbfdb122a933aaf93b2e60989c14b87b63fef9a455"} Apr 23 17:23:54.598032 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.598014 2578 scope.go:117] "RemoveContainer" containerID="0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5" Apr 23 17:23:54.598267 ip-10-0-129-102 kubenswrapper[2578]: E0423 17:23:54.598247 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5\": container with ID starting with 0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5 not found: ID does not exist" containerID="0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5" Apr 23 17:23:54.598314 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.598276 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5"} err="failed to get container status \"0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5\": rpc error: code = NotFound desc = could not find container \"0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5\": container with ID starting with 0d8d0ff6073778a9abd14c39b723b283308061d78e22d7ae4f6c2a21b3fa54b5 not found: ID does not exist" Apr 23 17:23:54.603458 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.603428 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9"] Apr 23 17:23:54.606565 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:54.606546 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-390fe-predictor-5d9d5697cb-ct4l9"] Apr 23 17:23:56.576017 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:23:56.575984 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" path="/var/lib/kubelet/pods/f9438b07-56b1-4b01-bab3-1e245eba99ac/volumes" Apr 23 17:24:18.823967 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:18.823935 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lpllm_3c77861a-3b9a-47ae-9a06-cdc3b74145f7/global-pull-secret-syncer/0.log" Apr 23 17:24:18.981670 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:18.981637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s9d8v_ac1c0ce2-6f52-471e-ba47-e46a7d7fc0a6/konnectivity-agent/0.log" Apr 23 17:24:19.025839 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:19.025813 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-102.ec2.internal_432a54920ff69b032f406403f8e82323/haproxy/0.log" Apr 23 17:24:22.514507 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.514476 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/alertmanager/0.log" Apr 23 17:24:22.531316 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.531289 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/config-reloader/0.log" Apr 23 17:24:22.548072 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.548043 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/kube-rbac-proxy-web/0.log" Apr 23 17:24:22.569391 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.569360 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/kube-rbac-proxy/0.log" Apr 23 17:24:22.587672 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.587648 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/kube-rbac-proxy-metric/0.log" Apr 23 17:24:22.604542 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.604513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/prom-label-proxy/0.log" Apr 23 17:24:22.620954 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.620924 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c2a18e2c-6e57-4281-89be-0b5ff6a32cfb/init-config-reloader/0.log" Apr 23 17:24:22.653506 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.653476 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-7nckc_71af0be6-1f33-49c7-ba45-d12899bb84e6/cluster-monitoring-operator/0.log" Apr 23 17:24:22.754905 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.754878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-558c4f944f-hzqnx_ab3ca3b1-edf8-47a5-8f0d-357e4211c00f/metrics-server/0.log" Apr 23 17:24:22.777144 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.777059 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-6btkg_1a7f563a-e902-46a6-ba9c-dab961c3b378/monitoring-plugin/0.log" Apr 23 17:24:22.805771 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.805735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-77j9v_7c860e58-8c97-4fba-b206-c1a4c598ff18/node-exporter/0.log" Apr 23 17:24:22.824080 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.824053 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-77j9v_7c860e58-8c97-4fba-b206-c1a4c598ff18/kube-rbac-proxy/0.log" Apr 23 17:24:22.839537 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:22.839511 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-77j9v_7c860e58-8c97-4fba-b206-c1a4c598ff18/init-textfile/0.log" Apr 23 17:24:23.007893 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:23.007862 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gd78m_00590187-4b05-446e-b9d1-efc30e43aec4/kube-rbac-proxy-main/0.log" Apr 23 17:24:23.023838 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:23.023809 2578 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gd78m_00590187-4b05-446e-b9d1-efc30e43aec4/kube-rbac-proxy-self/0.log" Apr 23 17:24:23.044425 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:23.044324 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gd78m_00590187-4b05-446e-b9d1-efc30e43aec4/openshift-state-metrics/0.log" Apr 23 17:24:24.619060 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:24.619030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wxt5q_203f31ff-6191-4108-83b4-7a8cd9446ee7/networking-console-plugin/0.log" Apr 23 17:24:25.347286 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:25.347208 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cbf969bd8-vvmt9_bd43d173-e3e7-434e-bf93-a2bca83ea33b/console/0.log" Apr 23 17:24:25.372931 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:25.372898 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-csnvz_99a16fcc-e537-4736-a7fd-4a673684aa6e/download-server/0.log" Apr 23 17:24:26.193947 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.193914 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k"] Apr 23 17:24:26.194312 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194272 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" Apr 23 17:24:26.194312 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194283 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" Apr 23 17:24:26.194312 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194291 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" Apr 23 17:24:26.194312 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194296 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" Apr 23 17:24:26.194312 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194307 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" Apr 23 17:24:26.194312 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194313 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" Apr 23 17:24:26.194518 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194363 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8431775-fe3c-414b-9d9b-e9586527e351" containerName="kserve-container" Apr 23 17:24:26.194518 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194374 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9438b07-56b1-4b01-bab3-1e245eba99ac" containerName="kserve-container" Apr 23 17:24:26.194518 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.194394 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7a2aa2a-1d34-47c8-87ad-ce6f739ee4ca" containerName="kserve-container" Apr 23 17:24:26.197640 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.197622 2578 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.199912 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.199887 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jm5zz\"/\"openshift-service-ca.crt\"" Apr 23 17:24:26.200545 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.200521 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jm5zz\"/\"default-dockercfg-rwbms\"" Apr 23 17:24:26.200909 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.200521 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jm5zz\"/\"kube-root-ca.crt\"" Apr 23 17:24:26.206964 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.206938 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k"] Apr 23 17:24:26.277187 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.277143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-lib-modules\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.277366 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.277195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-sys\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.277366 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.277255 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-podres\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.277366 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.277292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2j7f\" (UniqueName: \"kubernetes.io/projected/805f76ec-39c9-47bf-b2cc-c01065b3581d-kube-api-access-m2j7f\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.277366 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.277319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-proc\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.373696 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.373663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g2wqn_4db30a17-673a-4844-8750-e939b2e34518/dns/0.log" Apr 23 17:24:26.377799 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377773 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-sys\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.377940 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-podres\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.377940 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2j7f\" (UniqueName: \"kubernetes.io/projected/805f76ec-39c9-47bf-b2cc-c01065b3581d-kube-api-access-m2j7f\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.377940 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-sys\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.377940 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-proc\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.377940 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-proc\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.378191 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.377999 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-podres\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.378191 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.378022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-lib-modules\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.378191 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.378159 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/805f76ec-39c9-47bf-b2cc-c01065b3581d-lib-modules\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " 
pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.384966 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.384929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2j7f\" (UniqueName: \"kubernetes.io/projected/805f76ec-39c9-47bf-b2cc-c01065b3581d-kube-api-access-m2j7f\") pod \"perf-node-gather-daemonset-p8l9k\" (UID: \"805f76ec-39c9-47bf-b2cc-c01065b3581d\") " pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.389704 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.389686 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g2wqn_4db30a17-673a-4844-8750-e939b2e34518/kube-rbac-proxy/0.log" Apr 23 17:24:26.491007 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.490923 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-frq2q_74494667-d025-4d57-be34-03a72ee7cbaa/dns-node-resolver/0.log" Apr 23 17:24:26.508500 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.508472 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" Apr 23 17:24:26.631909 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.631881 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k"] Apr 23 17:24:26.634882 ip-10-0-129-102 kubenswrapper[2578]: W0423 17:24:26.634842 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod805f76ec_39c9_47bf_b2cc_c01065b3581d.slice/crio-eb4ad3a1e7def7091adfc2ec8cdccbad1c28a852c9424fe4fa4096c0bcbec209 WatchSource:0}: Error finding container eb4ad3a1e7def7091adfc2ec8cdccbad1c28a852c9424fe4fa4096c0bcbec209: Status 404 returned error can't find the container with id eb4ad3a1e7def7091adfc2ec8cdccbad1c28a852c9424fe4fa4096c0bcbec209 Apr 23 17:24:26.636775 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.636756 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:24:26.697162 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.697124 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" event={"ID":"805f76ec-39c9-47bf-b2cc-c01065b3581d","Type":"ContainerStarted","Data":"eb4ad3a1e7def7091adfc2ec8cdccbad1c28a852c9424fe4fa4096c0bcbec209"} Apr 23 17:24:26.871485 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.871447 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-58544877c9-fd586_0f3f1e66-afce-41c0-a9fe-72c3a0eb1f44/registry/0.log" Apr 23 17:24:26.926879 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:26.926836 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xsjmw_0cf2af80-3ff4-4717-af9c-87bb29677708/node-ca/0.log" Apr 23 17:24:27.701541 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:27.701503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" event={"ID":"805f76ec-39c9-47bf-b2cc-c01065b3581d","Type":"ContainerStarted","Data":"3966070fdbfd18081acd4dc0a7b1248983adab9bfda33ca8995c125e147f2a8f"} Apr 23 17:24:27.701943 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:27.701602 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" 
Apr 23 17:24:27.718561 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:27.718513 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k" podStartSLOduration=1.718498037 podStartE2EDuration="1.718498037s" podCreationTimestamp="2026-04-23 17:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:24:27.716967276 +0000 UTC m=+2959.739363404" watchObservedRunningTime="2026-04-23 17:24:27.718498037 +0000 UTC m=+2959.740894152"
Apr 23 17:24:27.876208 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:27.876179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rmrtl_743aa8f5-75e8-4c04-8f4a-d49896428015/serve-healthcheck-canary/0.log"
Apr 23 17:24:28.248211 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:28.248180 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lgpt4_d042bc1e-b16b-4b25-a9e1-19e50f2c799f/kube-rbac-proxy/0.log"
Apr 23 17:24:28.263167 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:28.263137 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lgpt4_d042bc1e-b16b-4b25-a9e1-19e50f2c799f/exporter/0.log"
Apr 23 17:24:28.278255 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:28.278227 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lgpt4_d042bc1e-b16b-4b25-a9e1-19e50f2c799f/extractor/0.log"
Apr 23 17:24:30.247124 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:30.247091 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6b94ff949c-rkqk9_59c1528b-83e6-4e0d-b9a9-37900b7098b2/manager/0.log"
Apr 23 17:24:30.508510 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:30.508416 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-5wlsz_4c0a5e5e-d682-44d7-b461-1b13c974a88a/manager/0.log"
Apr 23 17:24:30.551793 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:30.551766 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-tllcv_421a788d-3f91-42f6-a112-cdf4bd228da3/seaweedfs/0.log"
Apr 23 17:24:33.715738 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:33.715708 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jm5zz/perf-node-gather-daemonset-p8l9k"
Apr 23 17:24:34.135709 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:34.135669 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-94hpw_6b7ec9ae-872e-40fc-8d51-650ccb39c97b/kube-storage-version-migrator-operator/1.log"
Apr 23 17:24:34.136331 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:34.136310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-94hpw_6b7ec9ae-872e-40fc-8d51-650ccb39c97b/kube-storage-version-migrator-operator/0.log"
Apr 23 17:24:35.187096 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.187018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/kube-multus-additional-cni-plugins/0.log"
Apr 23 17:24:35.203540 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.203511 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/egress-router-binary-copy/0.log"
Apr 23 17:24:35.219587 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.219562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/cni-plugins/0.log"
Apr 23 17:24:35.235510 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.235485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/bond-cni-plugin/0.log"
Apr 23 17:24:35.250915 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.250892 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/routeoverride-cni/0.log"
Apr 23 17:24:35.268950 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.268910 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/whereabouts-cni-bincopy/0.log"
Apr 23 17:24:35.287683 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.287652 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fmfdm_07043a05-dfd7-4ffb-ac7d-95bd4f1e3ccd/whereabouts-cni/0.log"
Apr 23 17:24:35.477572 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.477492 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q2hgm_d285eb72-a566-4dcd-badf-2fefeec9c577/kube-multus/0.log"
Apr 23 17:24:35.497199 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.497166 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h6kzn_b7f21f2f-2763-41c8-af5e-52de8001226b/network-metrics-daemon/0.log"
Apr 23 17:24:35.512180 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:35.512156 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h6kzn_b7f21f2f-2763-41c8-af5e-52de8001226b/kube-rbac-proxy/0.log"
Apr 23 17:24:36.541937 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.541904 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-controller/0.log"
Apr 23 17:24:36.555424 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.555392 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/0.log"
Apr 23 17:24:36.570067 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.570033 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovn-acl-logging/1.log"
Apr 23 17:24:36.585781 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.585747 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/kube-rbac-proxy-node/0.log"
Apr 23 17:24:36.602270 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.602236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 17:24:36.615817 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.615793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/northd/0.log"
Apr 23 17:24:36.633674 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.633643 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/nbdb/0.log"
Apr 23 17:24:36.649577 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.649550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/sbdb/0.log"
Apr 23 17:24:36.743104 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:36.743077 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hbtmc_3301fde8-0566-4365-a9d8-b069eb4bebb7/ovnkube-controller/0.log"
Apr 23 17:24:37.929655 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:37.929624 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vrnns_d3c992ed-435c-4d40-bed7-1069bba6e643/check-endpoints/0.log"
Apr 23 17:24:37.948339 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:37.948311 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5lhlh_017fd19b-a66e-4805-8f42-625a4749d380/network-check-target-container/0.log"
Apr 23 17:24:38.805339 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:38.805304 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2dvdd_10e49ebb-e9c6-4f87-903f-bb7018d79002/iptables-alerter/0.log"
Apr 23 17:24:39.464367 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:39.464335 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hgwkb_a12982ff-ef0c-4f4d-88dc-c3e4719ef6d6/tuned/0.log"
Apr 23 17:24:40.998080 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:40.998048 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-lqxpv_1a83f99d-af3c-4f7f-ba85-ee5701997cd8/cluster-samples-operator/0.log"
Apr 23 17:24:41.011280 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:41.011244 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-lqxpv_1a83f99d-af3c-4f7f-ba85-ee5701997cd8/cluster-samples-operator-watch/0.log"
Apr 23 17:24:41.930845 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:41.930811 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-r2qw8_e4eef891-cf79-4965-b0db-94974d87932b/service-ca-operator/1.log"
Apr 23 17:24:41.931888 ip-10-0-129-102 kubenswrapper[2578]: I0423 17:24:41.931867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-r2qw8_e4eef891-cf79-4965-b0db-94974d87932b/service-ca-operator/0.log"