Apr 22 18:46:04.515192 ip-10-0-136-85 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:04.960674 ip-10-0-136-85 kubenswrapper[2535]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:04.960674 ip-10-0-136-85 kubenswrapper[2535]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:04.960674 ip-10-0-136-85 kubenswrapper[2535]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:04.960674 ip-10-0-136-85 kubenswrapper[2535]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:04.960674 ip-10-0-136-85 kubenswrapper[2535]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:04.964007 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.963924 2535 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:04.967861 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967839 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:04.967861 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967857 2535 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:04.967861 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967863 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:04.967861 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967867 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967871 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967876 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967881 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967885 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967889 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967893 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967897 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967921 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967925 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967929 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967941 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967946 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967950 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967955 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967959 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967962 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967966 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967970 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967974 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:04.968103 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967978 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967982 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967985 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967989 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967994 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.967997 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968001 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968005 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968010 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968014 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968019 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968022 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968036 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968041 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968046 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968050 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968054 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968059 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968063 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968067 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:04.968892 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968071 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968075 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968081 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968085 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968091 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968095 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968099 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968103 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968107 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968111 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968117 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968123 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968128 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968132 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968138 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968143 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968147 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968152 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968156 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:04.969614 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968160 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968165 2535 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968170 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968175 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968179 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968183 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968187 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968193 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968199 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968218 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968223 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968227 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968231 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968236 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968240 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968244 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968248 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968252 2535 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968257 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:04.970110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968261 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968266 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968270 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968274 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968278 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968878 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968886 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968890 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968895 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968917 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968922 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968926 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968933 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968940 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968946 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968952 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968957 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968961 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968966 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:04.970634 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968970 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968976 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968980 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968984 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968988 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968992 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.968996 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969000 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969004 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969009 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969013 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969017 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969022 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969026 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969031 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969035 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969040 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969044 2535 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969048 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969052 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:04.971506 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969056 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969060 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969064 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969068 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969073 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969078 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969082 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969087 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969091 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969096 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969100 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969104 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969108 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969112 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969116 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969120 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969124 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969129 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969133 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:04.972198 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969137 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969141 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969146 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969150 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969154 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969157 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969164 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969168 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969172 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969176 2535 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969180 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969184 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969189 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969193 2535 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969197 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969202 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969206 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969210 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969215 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969220 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:04.972717 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969224 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969227 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969231 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969236 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969240 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969245 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969249 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969253 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969257 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969261 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969265 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969268 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.969272 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970498 2535 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970520 2535 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970531 2535 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970538 2535 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970544 2535 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970549 2535 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970557 2535 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970568 2535 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:04.973333 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970573 2535 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970578 2535 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970584 2535 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970589 2535 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970594 2535 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970599 2535 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970604 2535 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970609 2535 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970614 2535 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970618 2535 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970623 2535 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970629 2535 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970634 2535 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970639 2535 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970643 2535 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970648 2535 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970654 2535 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970659 2535 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970663 2535 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970668 2535 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970674 2535 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970678 2535 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970683 2535 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970688 2535 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970692 2535 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:04.974042 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970699 2535 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970703 2535 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970708 2535 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970713 2535 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970717 2535 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970722 2535 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970730 2535 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970735 2535 flags.go:64] FLAG: --event-qps="50"
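The deprecation notices above say these command-line flags should move into the file passed via --config. A minimal sketch of what that KubeletConfiguration could look like for the flags named in this log — field names are from the kubelet.config.k8s.io/v1beta1 API, but the concrete values (socket path, reservations, TTL-equivalent eviction thresholds) are illustrative assumptions, not taken from this node:

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
  nodefs.available: "10%"
```

Note that --pod-infra-container-image has no config-file equivalent here; per the log, the sandbox image should instead be configured in the container runtime (e.g. pause_image in crio.conf).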
Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970740 2535 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970745 2535 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970750 2535 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970756 2535 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970761 2535 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970766 2535 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970771 2535 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970775 2535 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970780 2535 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970784 2535 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970789 2535 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970794 2535 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970798 2535 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970803 2535 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970809 2535 
flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970813 2535 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970818 2535 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:04.974685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970823 2535 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970828 2535 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970834 2535 flags.go:64] FLAG: --help="false" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970838 2535 flags.go:64] FLAG: --hostname-override="ip-10-0-136-85.ec2.internal" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970843 2535 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970848 2535 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970852 2535 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970858 2535 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970863 2535 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970868 2535 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970872 2535 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:04.975375 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:04.970877 2535 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970881 2535 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970886 2535 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970892 2535 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970896 2535 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970920 2535 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970925 2535 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970931 2535 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970935 2535 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970939 2535 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970944 2535 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970948 2535 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970953 2535 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:04.975375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970962 2535 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970967 2535 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 
22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970971 2535 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970976 2535 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970981 2535 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970987 2535 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970993 2535 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.970997 2535 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971004 2535 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971010 2535 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971016 2535 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971020 2535 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971025 2535 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971030 2535 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971035 2535 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971040 2535 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:04.971044 2535 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971049 2535 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971060 2535 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971065 2535 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971069 2535 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971075 2535 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971079 2535 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:04.975974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971089 2535 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971093 2535 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971098 2535 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971102 2535 flags.go:64] FLAG: --port="10250" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971107 2535 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971111 2535 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0141024636fc5313b" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971116 2535 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: 
I0422 18:46:04.971121 2535 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971126 2535 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971130 2535 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971134 2535 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971140 2535 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971144 2535 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971148 2535 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971154 2535 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971168 2535 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971173 2535 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971178 2535 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971182 2535 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971187 2535 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971191 2535 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971196 2535 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:04.971201 2535 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971205 2535 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971210 2535 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971215 2535 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:04.976548 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971219 2535 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971224 2535 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971229 2535 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971233 2535 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971238 2535 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971242 2535 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971247 2535 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971253 2535 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971258 2535 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971267 2535 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971272 2535 flags.go:64] FLAG: --tls-cert-file="" 
Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971276 2535 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971283 2535 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971287 2535 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971292 2535 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971296 2535 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971301 2535 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971306 2535 flags.go:64] FLAG: --v="2" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971312 2535 flags.go:64] FLAG: --version="false" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971319 2535 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971325 2535 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.971331 2535 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971483 2535 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971490 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:04.977212 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971495 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 
18:46:04.971500 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971504 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971508 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971512 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971516 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971520 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971523 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971527 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971531 2535 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971536 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971540 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971544 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971548 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 
18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971553 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971557 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971562 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971566 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971575 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971579 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:04.977782 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971583 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971588 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971592 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971596 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971602 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971608 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971612 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971616 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971620 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971624 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971628 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971635 2535 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971639 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971643 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971647 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971651 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971656 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971659 2535 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971664 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:04.978443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971668 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971672 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971676 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971680 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971684 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971688 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971692 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971696 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971700 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971704 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971709 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 
18:46:04.971715 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971719 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971723 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971727 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971731 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971735 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971739 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971743 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971747 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:04.979031 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971751 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971755 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971759 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971763 2535 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971772 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971776 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971780 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971784 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971789 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971793 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971797 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971801 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971805 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971809 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971813 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971817 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 
18:46:04.971820 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971824 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971831 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:04.979818 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971836 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:04.980334 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971841 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:04.980334 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971846 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:04.980334 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971850 2535 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:04.980334 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971857 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:04.980334 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.971861 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:04.980334 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.972606 2535 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:04.981242 
ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.981223 2535 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:46:04.981273 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.981244 2535 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:46:04.981308 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981294 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:04.981308 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981299 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:04.981308 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981303 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:04.981308 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981305 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:04.981308 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981308 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981313 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981318 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981321 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981324 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981327 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981330 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981333 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981336 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981338 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981341 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981344 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981346 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981349 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981352 
2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981355 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981359 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981362 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981364 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:04.981443 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981367 2535 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981369 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981372 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981374 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981377 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981380 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981382 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981384 2535 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981387 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981389 2535 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981392 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981395 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981397 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981400 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981403 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981406 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981408 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981410 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981413 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981415 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:04.981955 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981418 2535 
feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981420 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981423 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981425 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981428 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981430 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981433 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981435 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981438 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981440 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981442 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981445 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981447 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:04.982456 ip-10-0-136-85 
kubenswrapper[2535]: W0422 18:46:04.981450 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981453 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981455 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981458 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981460 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981463 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981465 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:04.982456 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981468 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981470 2535 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981474 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981476 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981479 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981482 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:04.982961 ip-10-0-136-85 
kubenswrapper[2535]: W0422 18:46:04.981484 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981488 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981490 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981493 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981495 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981497 2535 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981500 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981503 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981505 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981508 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981510 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981512 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981515 2535 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981518 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:04.982961 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981520 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981522 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981525 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.981530 2535 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981651 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981656 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981660 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981663 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981666 2535 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981669 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981672 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981675 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981677 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981680 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981683 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:04.983490 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981687 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981690 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981693 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981695 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981698 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981700 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981703 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981705 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981708 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981710 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981712 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981716 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981718 2535 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981720 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981723 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981726 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981728 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981731 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981733 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981736 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:04.983875 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981738 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981741 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981743 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981746 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981748 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981751 2535 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981753 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981756 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981759 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981761 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981764 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981767 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981771 2535 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981774 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981777 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981779 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981782 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981785 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981787 2535 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981790 2535 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:04.984399 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981793 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981795 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981798 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981800 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981804 2535 feature_gate.go:328] unrecognized feature gate: 
MachineConfigNodes Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981806 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981809 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981811 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981814 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981816 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981818 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981821 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981823 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981826 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981828 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981831 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981833 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981835 2535 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981838 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981840 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:04.984893 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981843 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981845 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981848 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981850 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981852 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981855 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981857 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981859 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981862 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981864 2535 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981866 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981869 2535 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981871 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981874 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:04.981876 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:04.985471 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.981880 2535 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:04.985837 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.982538 2535 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:46:04.986980 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.986967 2535 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:46:04.988100 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.988088 2535 server.go:1019] "Starting client certificate rotation" Apr 22 18:46:04.988203 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.988184 2535 
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:04.988232 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:04.988224 2535 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:05.013809 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.013790 2535 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:05.015593 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.015572 2535 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:05.029730 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.029707 2535 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:46:05.034860 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.034844 2535 log.go:25] "Validated CRI v1 image API" Apr 22 18:46:05.038561 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.038546 2535 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:46:05.042843 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.042821 2535 fs.go:135] Filesystem UUIDs: map[648d6552-664a-4715-a81a-60e23ff0a5c5:/dev/nvme0n1p3 6661526e-6334-4e0e-8093-e1350fc2e1c5:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 22 18:46:05.042945 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.042843 2535 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:46:05.047453 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:05.047423 2535 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:05.049085 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.048973 2535 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:05.047632363 +0000 UTC m=+0.411432243 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3063914 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec264f2842c0c0f96b93095227af6e30 SystemUUID:ec264f28-42c0-c0f9-6b93-095227af6e30 BootID:1071b64a-2dcc-4308-8bb1-082f7d968eab Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1f:63:e2:7e:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1f:63:e2:7e:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:35:a9:96:83:c4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 
Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:46:05.049085 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.049079 2535 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:46:05.049200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.049157 2535 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:46:05.050870 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.050848 2535 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:46:05.051023 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.050873 2535 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-85.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:05.051072 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.051036 2535 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:05.051072 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.051045 2535 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:05.051072 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.051057 2535 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:05.051072 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.051067 2535 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:05.052164 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.052154 2535 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:05.052268 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.052259 2535 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:05.054707 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.054697 2535 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:05.054741 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.054711 2535 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:46:05.054741 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.054722 2535 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:05.054741 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.054731 2535 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:05.054741 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.054739 2535 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 18:46:05.055777 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.055764 2535 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:05.055819 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.055783 2535 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:05.059252 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.059227 2535 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:05.060489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.060476 2535 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:05.062220 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062206 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062224 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062231 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062236 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062242 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062248 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062256 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062262 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062269 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062275 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062283 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:05.062298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.062292 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:05.063152 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.063142 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:05.063152 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.063152 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:05.066892 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.066878 2535 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:05.067033 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.066937 2535 server.go:1295] "Started kubelet" Apr 22 18:46:05.067093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.067032 2535 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:05.067161 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.067113 2535 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:05.067231 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.067180 2535 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:05.067716 ip-10-0-136-85 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:46:05.068274 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.068092 2535 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:05.068274 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.068136 2535 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-85.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:05.068274 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.068239 2535 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-85.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:05.068425 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.068298 2535 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:05.069636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.069622 2535 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:05.072038 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.072017 2535 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cb4z7" Apr 22 18:46:05.074084 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.074066 2535 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:05.074643 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.074606 2535 fs_resource_analyzer.go:67] 
"Starting FS ResourceAnalyzer" Apr 22 18:46:05.074787 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.073848 2535 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-85.ec2.internal.18a8c230270aab36 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-85.ec2.internal,UID:ip-10-0-136-85.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-85.ec2.internal,},FirstTimestamp:2026-04-22 18:46:05.066890038 +0000 UTC m=+0.430689918,LastTimestamp:2026-04-22 18:46:05.066890038 +0000 UTC m=+0.430689918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-85.ec2.internal,}" Apr 22 18:46:05.076270 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.076253 2535 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:05.076370 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.076360 2535 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:05.076480 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.076315 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.076623 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.076400 2535 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:05.076623 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.076548 2535 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:05.076623 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.076622 2535 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:05.077084 
ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077068 2535 factory.go:55] Registering systemd factory Apr 22 18:46:05.077161 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077089 2535 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:05.077717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077699 2535 factory.go:153] Registering CRI-O factory Apr 22 18:46:05.077796 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077720 2535 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:05.077796 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077774 2535 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:05.077876 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077799 2535 factory.go:103] Registering Raw factory Apr 22 18:46:05.077876 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.077813 2535 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:05.078261 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.078216 2535 manager.go:319] Starting recovery of all containers Apr 22 18:46:05.079486 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.079424 2535 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cb4z7" Apr 22 18:46:05.081978 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.081806 2535 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:05.087074 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.087046 2535 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:05.090049 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.090026 2535 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-85.ec2.internal\" not found" node="ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.090459 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.090443 2535 manager.go:324] Recovery completed Apr 22 18:46:05.094435 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.094422 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:05.096955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.096939 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:05.097065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.096973 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:05.097065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.096989 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:05.097459 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.097445 2535 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:05.097459 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.097458 2535 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:05.097535 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.097479 2535 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:05.099512 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:05.099500 2535 policy_none.go:49] "None policy: Start" Apr 22 18:46:05.099546 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.099516 2535 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:05.099546 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.099526 2535 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:05.150494 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150476 2535 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.150522 2535 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150535 2535 server.go:85] "Starting device plugin registration server" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150751 2535 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150762 2535 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150861 2535 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150949 2535 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.150957 2535 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.151498 2535 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:46:05.162889 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.151536 2535 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.251620 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.251573 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:05.252139 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.252118 2535 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:05.252676 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.252654 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:05.252761 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.252684 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:05.252761 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.252697 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:05.252761 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.252731 2535 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.253676 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.253659 2535 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:05.253761 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.253690 2535 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:05.253761 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.253713 2535 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:46:05.253761 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.253722 2535 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:05.253761 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.253757 2535 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:05.256708 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.256687 2535 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:05.258721 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.258707 2535 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.258788 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.258727 2535 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-85.ec2.internal\": node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.278950 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.278935 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.354537 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.354514 2535 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal"] Apr 22 18:46:05.354619 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.354594 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:05.355948 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.355933 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:05.356028 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.355971 2535 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:05.356028 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.355980 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:05.357073 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357061 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:05.357240 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357225 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.357300 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357261 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:05.357694 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357678 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:05.357766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357707 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:05.357766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357677 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:05.357766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357722 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:05.357766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357736 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 
18:46:05.357766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.357748 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:05.359105 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.359090 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.359196 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.359131 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:05.359748 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.359734 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:05.359820 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.359758 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:05.359820 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.359769 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:05.378098 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.378080 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6b3c4961afadf912d9474ffe329248eb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal\" (UID: \"6b3c4961afadf912d9474ffe329248eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.378159 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.378115 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6b3c4961afadf912d9474ffe329248eb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal\" (UID: \"6b3c4961afadf912d9474ffe329248eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.378159 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.378133 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8fe672e26d8f153f55920269ad886fc-config\") pod \"kube-apiserver-proxy-ip-10-0-136-85.ec2.internal\" (UID: \"e8fe672e26d8f153f55920269ad886fc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.379099 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.379084 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.382400 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.382386 2535 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-85.ec2.internal\" not found" node="ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.386578 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.386560 2535 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-85.ec2.internal\" not found" node="ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.478397 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.478377 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8fe672e26d8f153f55920269ad886fc-config\") pod \"kube-apiserver-proxy-ip-10-0-136-85.ec2.internal\" (UID: \"e8fe672e26d8f153f55920269ad886fc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.478489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.478400 2535 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6b3c4961afadf912d9474ffe329248eb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal\" (UID: \"6b3c4961afadf912d9474ffe329248eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.478489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.478421 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b3c4961afadf912d9474ffe329248eb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal\" (UID: \"6b3c4961afadf912d9474ffe329248eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.478489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.478446 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b3c4961afadf912d9474ffe329248eb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal\" (UID: \"6b3c4961afadf912d9474ffe329248eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.478489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.478464 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8fe672e26d8f153f55920269ad886fc-config\") pod \"kube-apiserver-proxy-ip-10-0-136-85.ec2.internal\" (UID: \"e8fe672e26d8f153f55920269ad886fc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.478489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.478467 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6b3c4961afadf912d9474ffe329248eb-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal\" (UID: \"6b3c4961afadf912d9474ffe329248eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.479416 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.479403 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.580392 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.580339 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.680836 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.680812 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.683971 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.683957 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.689446 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.689428 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" Apr 22 18:46:05.781875 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.781854 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.882325 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.882268 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.982792 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:05.982767 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:05.987939 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.987919 2535 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:05.988062 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.988047 2535 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:05.988112 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:05.988082 2535 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:06.074413 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.074385 2535 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:06.083091 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:06.083072 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:06.083200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.083092 2535 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:05 +0000 UTC" deadline="2027-11-06 11:31:21.86171311 +0000 UTC" Apr 22 18:46:06.083200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.083124 2535 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13504h45m15.778592828s" Apr 22 18:46:06.089459 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.089441 2535 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:06.114352 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.114329 2535 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-l82tt" Apr 22 18:46:06.119936 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.119919 2535 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-l82tt" Apr 22 18:46:06.142248 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:06.142223 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b3c4961afadf912d9474ffe329248eb.slice/crio-707bb72a54f4c5a17fb41307c25504aa171064e87d66cc5cf3e1d5bed42c5232 WatchSource:0}: Error finding container 707bb72a54f4c5a17fb41307c25504aa171064e87d66cc5cf3e1d5bed42c5232: Status 404 returned error can't find the container with id 707bb72a54f4c5a17fb41307c25504aa171064e87d66cc5cf3e1d5bed42c5232 Apr 22 18:46:06.145799 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.145784 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 
18:46:06.178559 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:06.178534 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8fe672e26d8f153f55920269ad886fc.slice/crio-4cef2f9b94f56bd44348f62315d876ec116c92d27410ffb3232bd006ac9993b9 WatchSource:0}: Error finding container 4cef2f9b94f56bd44348f62315d876ec116c92d27410ffb3232bd006ac9993b9: Status 404 returned error can't find the container with id 4cef2f9b94f56bd44348f62315d876ec116c92d27410ffb3232bd006ac9993b9 Apr 22 18:46:06.183425 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:06.183406 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:06.256191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.256154 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" event={"ID":"e8fe672e26d8f153f55920269ad886fc","Type":"ContainerStarted","Data":"4cef2f9b94f56bd44348f62315d876ec116c92d27410ffb3232bd006ac9993b9"} Apr 22 18:46:06.257122 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.257103 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" event={"ID":"6b3c4961afadf912d9474ffe329248eb","Type":"ContainerStarted","Data":"707bb72a54f4c5a17fb41307c25504aa171064e87d66cc5cf3e1d5bed42c5232"} Apr 22 18:46:06.283472 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:06.283452 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:06.373936 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.373896 2535 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:06.383547 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:06.383496 2535 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-85.ec2.internal\" not found" Apr 22 18:46:06.400867 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.400847 2535 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:06.476629 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.476609 2535 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" Apr 22 18:46:06.488744 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.488728 2535 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:06.489726 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.489714 2535 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" Apr 22 18:46:06.498118 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.498102 2535 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:06.832052 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:06.831978 2535 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:07.055605 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.055574 2535 apiserver.go:52] "Watching apiserver" Apr 22 18:46:07.061931 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.061892 2535 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:07.062236 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.062204 2535 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal","openshift-multus/multus-pjv5t","openshift-multus/network-metrics-daemon-5g7dk","openshift-ovn-kubernetes/ovnkube-node-b8wsf","kube-system/konnectivity-agent-cmrnt","kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal","openshift-cluster-node-tuning-operator/tuned-5rsfp","openshift-dns/node-resolver-hgtq9","openshift-image-registry/node-ca-642sx","openshift-multus/multus-additional-cni-plugins-87xk8","openshift-network-diagnostics/network-check-target-89stm","openshift-network-operator/iptables-alerter-hkwlr"] Apr 22 18:46:07.064252 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.064235 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.065637 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.065229 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.066736 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.066564 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:07.066736 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.066634 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:07.066881 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.066628 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:07.067030 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.066889 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.067030 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.066952 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zxb2x\"" Apr 22 18:46:07.067241 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.067223 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.067983 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.067964 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:07.068174 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.068157 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-k7lhr\"" Apr 22 18:46:07.068360 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.068344 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:07.068802 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.068782 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.068915 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.068883 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.069161 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.069142 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.070211 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.070191 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.071587 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.071326 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072076 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072333 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fs825\"" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072500 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072539 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072697 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072760 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:07.073106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.072800 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jtxgh\"" Apr 22 18:46:07.073711 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.073691 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:07.073940 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.073919 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.074093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.074076 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:07.074267 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.074251 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:07.074349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.074272 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.074627 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.074595 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.074691 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.074672 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-rbl2b\"" Apr 22 18:46:07.074781 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.074766 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.075427 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.075406 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:07.075894 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.075876 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hgtq9" Apr 22 18:46:07.075992 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.075974 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:07.076062 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.076038 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:07.078090 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.078070 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.078606 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.078566 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6vt7\"" Apr 22 18:46:07.079551 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.079131 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fbpqc\"" Apr 22 18:46:07.079551 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.079444 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.081499 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081341 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:07.081499 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081491 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.081623 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081558 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:07.081623 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081613 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sf2bc\"" Apr 22 18:46:07.081723 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081662 2535 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.081826 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081811 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.081882 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.081830 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:07.082025 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.082010 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:07.082202 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.082166 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:07.083368 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.083274 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qgzpj\"" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085422 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhv5\" (UniqueName: \"kubernetes.io/projected/7a529510-3e55-4661-b315-c2dac62260b8-kube-api-access-crhv5\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085457 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476d713f-1df9-4369-9fcb-94a61680226e-host\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " 
pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085483 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-cni-bin\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085506 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-ovnkube-script-lib\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085529 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/92f3b87a-b94c-43ff-b9f1-8f66016fc2ce-konnectivity-ca\") pod \"konnectivity-agent-cmrnt\" (UID: \"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce\") " pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085551 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-kubernetes\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085626 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/476d713f-1df9-4369-9fcb-94a61680226e-serviceca\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085658 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-cni-binary-copy\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085681 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085697 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d21b0e10-276b-4b90-8d4c-23aa447f6f29-iptables-alerter-script\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.085712 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085715 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-tuned\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.086200 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:07.085740 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-systemd\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085753 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-var-lib-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085767 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-cnibin\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085780 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-ovnkube-config\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085795 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-ovn\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085817 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-system-cni-dir\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085839 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-cni-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085856 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-conf-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085870 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-run-netns\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085890 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-socket-dir-parent\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085926 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-node-log\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.085967 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-k8s-cni-cncf-io\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086007 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-run\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086039 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-lib-modules\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086058 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62c7a7b2-9390-4fb2-84e7-23f981d15afe-tmp-dir\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086074 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-os-release\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086200 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086087 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-cni-bin\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086100 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d21b0e10-276b-4b90-8d4c-23aa447f6f29-host-slash\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086131 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-hostroot\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086150 2535 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-log-socket\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086171 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-run-ovn-kubernetes\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086216 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/92f3b87a-b94c-43ff-b9f1-8f66016fc2ce-agent-certs\") pod \"konnectivity-agent-cmrnt\" (UID: \"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce\") " pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086258 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysctl-conf\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086295 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " 
pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086323 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-slash\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086356 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-sys-fs\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086380 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086403 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-system-cni-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086435 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086459 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-modprobe-d\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086487 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-registration-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086509 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-os-release\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.086917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086532 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqtxv\" (UniqueName: \"kubernetes.io/projected/ac730e3f-49e1-4703-9a89-4d82e11d265d-kube-api-access-dqtxv\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " 
pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086551 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-kubelet\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086584 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4pq\" (UniqueName: \"kubernetes.io/projected/62c7a7b2-9390-4fb2-84e7-23f981d15afe-kube-api-access-wc4pq\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086623 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086648 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-cnibin\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086672 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086695 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-socket-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086741 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-sys\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086764 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-cni-multus\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086795 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-kubelet\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086819 2535 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2251f062-5650-40fb-b187-729124eb8087-multus-daemon-config\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086843 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-multus-certs\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086879 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-systemd-units\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086931 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88bb7a51-9742-46e6-817e-2c17c4357d07-ovn-node-metrics-cert\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086951 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysconfig\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.086974 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-netns\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.087636 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087011 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vggw7\" (UniqueName: \"kubernetes.io/projected/2251f062-5650-40fb-b187-729124eb8087-kube-api-access-vggw7\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087034 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087091 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-env-overrides\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087138 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysctl-d\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087171 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-systemd\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087199 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6aedca06-2e88-42cc-a622-3d71dec7b063-tmp\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087233 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62c7a7b2-9390-4fb2-84e7-23f981d15afe-hosts-file\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087261 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2251f062-5650-40fb-b187-729124eb8087-cni-binary-copy\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087289 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-etc-kubernetes\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087313 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmvp\" (UniqueName: \"kubernetes.io/projected/252dfd14-9c83-4928-bbcd-d84b479525bc-kube-api-access-tnmvp\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087335 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-cni-netd\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087356 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-host\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087411 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6f5\" (UniqueName: \"kubernetes.io/projected/476d713f-1df9-4369-9fcb-94a61680226e-kube-api-access-9n6f5\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:07.087438 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087464 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkxcp\" (UniqueName: \"kubernetes.io/projected/d21b0e10-276b-4b90-8d4c-23aa447f6f29-kube-api-access-gkxcp\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087489 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-var-lib-kubelet\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.088195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087511 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-etc-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.088660 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087545 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzjqm\" (UniqueName: 
\"kubernetes.io/projected/88bb7a51-9742-46e6-817e-2c17c4357d07-kube-api-access-gzjqm\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.088660 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087568 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqk4\" (UniqueName: \"kubernetes.io/projected/6aedca06-2e88-42cc-a622-3d71dec7b063-kube-api-access-brqk4\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.088660 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087593 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-device-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.088660 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.087616 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.120467 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.120445 2535 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:06 +0000 UTC" deadline="2027-11-25 05:30:23.92046831 +0000 UTC" Apr 22 18:46:07.120467 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.120466 2535 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13954h44m16.800005636s" Apr 22 18:46:07.177750 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.177729 2535 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:07.187838 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187813 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-os-release\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.187964 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187846 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-cni-bin\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.187964 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187868 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d21b0e10-276b-4b90-8d4c-23aa447f6f29-host-slash\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.187964 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187882 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-hostroot\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.187964 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187896 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-log-socket\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187961 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-os-release\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187960 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-cni-bin\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187959 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-run-ovn-kubernetes\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187997 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-log-socket\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187998 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/92f3b87a-b94c-43ff-b9f1-8f66016fc2ce-agent-certs\") pod \"konnectivity-agent-cmrnt\" (UID: \"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce\") " pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187996 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-hostroot\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187997 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-run-ovn-kubernetes\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.187969 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d21b0e10-276b-4b90-8d4c-23aa447f6f29-host-slash\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188024 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysctl-conf\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188067 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188093 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-slash\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188119 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-sys-fs\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188139 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysctl-conf\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188144 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:07.188167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188169 2535 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-system-cni-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188187 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188195 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-sys-fs\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188202 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-modprobe-d\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188217 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-registration-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:07.188233 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-os-release\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188247 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqtxv\" (UniqueName: \"kubernetes.io/projected/ac730e3f-49e1-4703-9a89-4d82e11d265d-kube-api-access-dqtxv\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188262 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-kubelet\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.188264 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188271 2535 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.188334 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:07.688292574 +0000 UTC m=+3.052092441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188472 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-slash\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188559 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-registration-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188615 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-modprobe-d\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188631 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-os-release\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188771 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-kubelet\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188276 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4pq\" (UniqueName: \"kubernetes.io/projected/62c7a7b2-9390-4fb2-84e7-23f981d15afe-kube-api-access-wc4pq\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9"
Apr 22 18:46:07.188888 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188808 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188824 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-cnibin\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188838 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-system-cni-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188846 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188891 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188889 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188934 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-cnibin\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188946 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-socket-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188971 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-sys\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188993 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.188997 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-cni-multus\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189030 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-kubelet\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189055 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2251f062-5650-40fb-b187-729124eb8087-multus-daemon-config\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189064 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-socket-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189074 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-sys\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189079 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-multus-certs\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189115 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-systemd-units\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.189688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189142 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88bb7a51-9742-46e6-817e-2c17c4357d07-ovn-node-metrics-cert\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189109 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-kubelet\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189032 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-var-lib-cni-multus\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189166 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysconfig\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189189 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-multus-certs\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189191 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-netns\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189220 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vggw7\" (UniqueName: \"kubernetes.io/projected/2251f062-5650-40fb-b187-729124eb8087-kube-api-access-vggw7\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189244 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189268 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-env-overrides\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189290 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysctl-d\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189312 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-systemd\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189334 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6aedca06-2e88-42cc-a622-3d71dec7b063-tmp\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189357 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62c7a7b2-9390-4fb2-84e7-23f981d15afe-hosts-file\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189383 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2251f062-5650-40fb-b187-729124eb8087-cni-binary-copy\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189405 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-etc-kubernetes\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189429 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmvp\" (UniqueName: \"kubernetes.io/projected/252dfd14-9c83-4928-bbcd-d84b479525bc-kube-api-access-tnmvp\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189453 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-cni-netd\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189477 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-host\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.190472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189500 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6f5\" (UniqueName: \"kubernetes.io/projected/476d713f-1df9-4369-9fcb-94a61680226e-kube-api-access-9n6f5\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189526 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189557 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-systemd-units\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189556 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkxcp\" (UniqueName: \"kubernetes.io/projected/d21b0e10-276b-4b90-8d4c-23aa447f6f29-kube-api-access-gkxcp\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189592 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2251f062-5650-40fb-b187-729124eb8087-multus-daemon-config\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189592 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-var-lib-kubelet\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189631 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-var-lib-kubelet\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189646 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-etc-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189672 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzjqm\" (UniqueName: \"kubernetes.io/projected/88bb7a51-9742-46e6-817e-2c17c4357d07-kube-api-access-gzjqm\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189697 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brqk4\" (UniqueName: \"kubernetes.io/projected/6aedca06-2e88-42cc-a622-3d71dec7b063-kube-api-access-brqk4\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189722 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-device-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189748 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189774 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crhv5\" (UniqueName: \"kubernetes.io/projected/7a529510-3e55-4661-b315-c2dac62260b8-kube-api-access-crhv5\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189784 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/62c7a7b2-9390-4fb2-84e7-23f981d15afe-hosts-file\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189798 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476d713f-1df9-4369-9fcb-94a61680226e-host\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189823 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-cni-bin\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189839 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-etc-kubernetes\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.191292 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189847 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-ovnkube-script-lib\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189872 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/92f3b87a-b94c-43ff-b9f1-8f66016fc2ce-konnectivity-ca\") pod \"konnectivity-agent-cmrnt\" (UID: \"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce\") " pod="kube-system/konnectivity-agent-cmrnt"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189895 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-kubernetes\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189943 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/476d713f-1df9-4369-9fcb-94a61680226e-serviceca\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189967 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-cni-binary-copy\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.189993 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190021 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d21b0e10-276b-4b90-8d4c-23aa447f6f29-iptables-alerter-script\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190046 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-tuned\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190070 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-systemd\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190072 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-host\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190095 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-var-lib-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190119 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-cnibin\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190143 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-ovnkube-config\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190169 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-ovn\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190195 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-system-cni-dir\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190220 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-cni-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190247 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-conf-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.191891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190273 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-run-netns\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190298 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-socket-dir-parent\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190323 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-node-log\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190352 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-k8s-cni-cncf-io\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190383 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-run\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190407 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-lib-modules\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190432 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62c7a7b2-9390-4fb2-84e7-23f981d15afe-tmp-dir\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190508 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190574 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysconfig\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190619 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-netns\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190686 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190734 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/62c7a7b2-9390-4fb2-84e7-23f981d15afe-tmp-dir\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190790 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-etc-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190795 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-sysctl-d\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190844 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-var-lib-openvswitch\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.190894 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-cnibin\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191168 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-run-netns\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191232 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-ovn\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:46:07.192672
ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191261 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac730e3f-49e1-4703-9a89-4d82e11d265d-system-cni-dir\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191274 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-env-overrides\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191300 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-cni-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191325 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-conf-dir\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191494 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-systemd\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:07.191548 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-host-run-k8s-cni-cncf-io\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191600 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2251f062-5650-40fb-b187-729124eb8087-multus-socket-dir-parent\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191645 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-node-log\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191669 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-device-dir\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191726 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a529510-3e55-4661-b315-c2dac62260b8-etc-selinux\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:07.191783 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/476d713f-1df9-4369-9fcb-94a61680226e-host\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191798 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-run\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.191970 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-ovnkube-config\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192147 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88bb7a51-9742-46e6-817e-2c17c4357d07-ovn-node-metrics-cert\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192179 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2251f062-5650-40fb-b187-729124eb8087-cni-binary-copy\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192185 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-cni-bin\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192258 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-lib-modules\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192286 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-kubernetes\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.193390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192664 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/92f3b87a-b94c-43ff-b9f1-8f66016fc2ce-agent-certs\") pod \"konnectivity-agent-cmrnt\" (UID: \"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce\") " pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192667 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-cni-binary-copy\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192688 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/92f3b87a-b94c-43ff-b9f1-8f66016fc2ce-konnectivity-ca\") pod \"konnectivity-agent-cmrnt\" (UID: \"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce\") " pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192704 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-host-cni-netd\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192831 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88bb7a51-9742-46e6-817e-2c17c4357d07-run-systemd\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192865 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88bb7a51-9742-46e6-817e-2c17c4357d07-ovnkube-script-lib\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.192887 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/476d713f-1df9-4369-9fcb-94a61680226e-serviceca\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.193144 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/ac730e3f-49e1-4703-9a89-4d82e11d265d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.193176 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d21b0e10-276b-4b90-8d4c-23aa447f6f29-iptables-alerter-script\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.194000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.193291 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6aedca06-2e88-42cc-a622-3d71dec7b063-tmp\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.195279 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.195247 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6aedca06-2e88-42cc-a622-3d71dec7b063-etc-tuned\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.197575 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.197554 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:07.197659 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.197580 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:07.197659 ip-10-0-136-85 kubenswrapper[2535]: E0422 
18:46:07.197591 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:07.197744 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.197660 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:07.697645968 +0000 UTC m=+3.061445835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:07.200516 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.200493 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4pq\" (UniqueName: \"kubernetes.io/projected/62c7a7b2-9390-4fb2-84e7-23f981d15afe-kube-api-access-wc4pq\") pod \"node-resolver-hgtq9\" (UID: \"62c7a7b2-9390-4fb2-84e7-23f981d15afe\") " pod="openshift-dns/node-resolver-hgtq9" Apr 22 18:46:07.203255 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.203233 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6f5\" (UniqueName: \"kubernetes.io/projected/476d713f-1df9-4369-9fcb-94a61680226e-kube-api-access-9n6f5\") pod \"node-ca-642sx\" (UID: \"476d713f-1df9-4369-9fcb-94a61680226e\") " pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.203954 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:07.203934 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmvp\" (UniqueName: \"kubernetes.io/projected/252dfd14-9c83-4928-bbcd-d84b479525bc-kube-api-access-tnmvp\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:07.204232 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.204211 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vggw7\" (UniqueName: \"kubernetes.io/projected/2251f062-5650-40fb-b187-729124eb8087-kube-api-access-vggw7\") pod \"multus-pjv5t\" (UID: \"2251f062-5650-40fb-b187-729124eb8087\") " pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.204529 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.204431 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzjqm\" (UniqueName: \"kubernetes.io/projected/88bb7a51-9742-46e6-817e-2c17c4357d07-kube-api-access-gzjqm\") pod \"ovnkube-node-b8wsf\" (UID: \"88bb7a51-9742-46e6-817e-2c17c4357d07\") " pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.204529 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.204449 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqtxv\" (UniqueName: \"kubernetes.io/projected/ac730e3f-49e1-4703-9a89-4d82e11d265d-kube-api-access-dqtxv\") pod \"multus-additional-cni-plugins-87xk8\" (UID: \"ac730e3f-49e1-4703-9a89-4d82e11d265d\") " pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.204529 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.204484 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhv5\" (UniqueName: \"kubernetes.io/projected/7a529510-3e55-4661-b315-c2dac62260b8-kube-api-access-crhv5\") pod \"aws-ebs-csi-driver-node-gq5hm\" (UID: \"7a529510-3e55-4661-b315-c2dac62260b8\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.204689 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.204601 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqk4\" (UniqueName: \"kubernetes.io/projected/6aedca06-2e88-42cc-a622-3d71dec7b063-kube-api-access-brqk4\") pod \"tuned-5rsfp\" (UID: \"6aedca06-2e88-42cc-a622-3d71dec7b063\") " pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.206039 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.206022 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkxcp\" (UniqueName: \"kubernetes.io/projected/d21b0e10-276b-4b90-8d4c-23aa447f6f29-kube-api-access-gkxcp\") pod \"iptables-alerter-hkwlr\" (UID: \"d21b0e10-276b-4b90-8d4c-23aa447f6f29\") " pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.382220 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.382139 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-642sx" Apr 22 18:46:07.394112 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.394080 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pjv5t" Apr 22 18:46:07.399297 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.399273 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:07.401660 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.401641 2535 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:07.404756 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.404739 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:07.411343 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.411323 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" Apr 22 18:46:07.416867 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.416851 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" Apr 22 18:46:07.423464 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.423447 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-87xk8" Apr 22 18:46:07.429978 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.429962 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hgtq9" Apr 22 18:46:07.436446 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.436427 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hkwlr" Apr 22 18:46:07.695064 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.694995 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:07.695217 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.695111 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:07.695217 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.695195 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:08.695167437 +0000 UTC m=+4.058967344 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:07.795794 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:07.795765 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:07.795948 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.795930 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:07.795998 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.795953 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:07.795998 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.795962 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:07.796062 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:07.796012 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:46:08.795998557 +0000 UTC m=+4.159798428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:07.817650 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.817431 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aedca06_2e88_42cc_a622_3d71dec7b063.slice/crio-44689684611c973f81337363a96acb575f793284434765b3542d431823cb759c WatchSource:0}: Error finding container 44689684611c973f81337363a96acb575f793284434765b3542d431823cb759c: Status 404 returned error can't find the container with id 44689684611c973f81337363a96acb575f793284434765b3542d431823cb759c Apr 22 18:46:07.818242 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.818220 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88bb7a51_9742_46e6_817e_2c17c4357d07.slice/crio-f13615f2643fbef8b2c9b9d12362aa88a08798c5d3c0c5934c4bba8c21601181 WatchSource:0}: Error finding container f13615f2643fbef8b2c9b9d12362aa88a08798c5d3c0c5934c4bba8c21601181: Status 404 returned error can't find the container with id f13615f2643fbef8b2c9b9d12362aa88a08798c5d3c0c5934c4bba8c21601181 Apr 22 18:46:07.819659 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.819633 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd21b0e10_276b_4b90_8d4c_23aa447f6f29.slice/crio-850c39cf864cbbfaacd6e1a3c7553d191df7861ca5771bc7dee657558b766319 WatchSource:0}: Error finding container 
850c39cf864cbbfaacd6e1a3c7553d191df7861ca5771bc7dee657558b766319: Status 404 returned error can't find the container with id 850c39cf864cbbfaacd6e1a3c7553d191df7861ca5771bc7dee657558b766319 Apr 22 18:46:07.823104 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.823084 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f3b87a_b94c_43ff_b9f1_8f66016fc2ce.slice/crio-7e2c086ba89fc80fd9e4337c4b9b3a81760be51aefcaf78ed8aaf6128e11bb0e WatchSource:0}: Error finding container 7e2c086ba89fc80fd9e4337c4b9b3a81760be51aefcaf78ed8aaf6128e11bb0e: Status 404 returned error can't find the container with id 7e2c086ba89fc80fd9e4337c4b9b3a81760be51aefcaf78ed8aaf6128e11bb0e Apr 22 18:46:07.823739 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.823717 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod476d713f_1df9_4369_9fcb_94a61680226e.slice/crio-21eca2f4e7684f6f99c03ec3b9ef4b40c11bd504f4b45eccff140d5273811c1d WatchSource:0}: Error finding container 21eca2f4e7684f6f99c03ec3b9ef4b40c11bd504f4b45eccff140d5273811c1d: Status 404 returned error can't find the container with id 21eca2f4e7684f6f99c03ec3b9ef4b40c11bd504f4b45eccff140d5273811c1d Apr 22 18:46:07.825967 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.825934 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2251f062_5650_40fb_b187_729124eb8087.slice/crio-54f2ef56c7f7359cab678681a0ceacd00a9442c82bfd32dc26df056f13ca90ec WatchSource:0}: Error finding container 54f2ef56c7f7359cab678681a0ceacd00a9442c82bfd32dc26df056f13ca90ec: Status 404 returned error can't find the container with id 54f2ef56c7f7359cab678681a0ceacd00a9442c82bfd32dc26df056f13ca90ec Apr 22 18:46:07.828146 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:07.827767 2535 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a529510_3e55_4661_b315_c2dac62260b8.slice/crio-844295f7c2f2373ccb51ceec75a67eb88b4da43f99fd8671e52995368cd910ab WatchSource:0}: Error finding container 844295f7c2f2373ccb51ceec75a67eb88b4da43f99fd8671e52995368cd910ab: Status 404 returned error can't find the container with id 844295f7c2f2373ccb51ceec75a67eb88b4da43f99fd8671e52995368cd910ab Apr 22 18:46:08.120847 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.120747 2535 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:06 +0000 UTC" deadline="2028-01-21 15:20:28.010351768 +0000 UTC" Apr 22 18:46:08.120847 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.120776 2535 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15332h34m19.889577975s" Apr 22 18:46:08.263115 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.263036 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-642sx" event={"ID":"476d713f-1df9-4369-9fcb-94a61680226e","Type":"ContainerStarted","Data":"21eca2f4e7684f6f99c03ec3b9ef4b40c11bd504f4b45eccff140d5273811c1d"} Apr 22 18:46:08.266319 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.266289 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cmrnt" event={"ID":"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce","Type":"ContainerStarted","Data":"7e2c086ba89fc80fd9e4337c4b9b3a81760be51aefcaf78ed8aaf6128e11bb0e"} Apr 22 18:46:08.267821 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.267792 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" event={"ID":"6aedca06-2e88-42cc-a622-3d71dec7b063","Type":"ContainerStarted","Data":"44689684611c973f81337363a96acb575f793284434765b3542d431823cb759c"} Apr 22 18:46:08.270011 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:08.269984 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" event={"ID":"e8fe672e26d8f153f55920269ad886fc","Type":"ContainerStarted","Data":"9dd74449285f16e060762f4314ce3246e10601c45db9e2f0d5875b40336666c2"}
Apr 22 18:46:08.272381 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.272338 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjv5t" event={"ID":"2251f062-5650-40fb-b187-729124eb8087","Type":"ContainerStarted","Data":"54f2ef56c7f7359cab678681a0ceacd00a9442c82bfd32dc26df056f13ca90ec"}
Apr 22 18:46:08.275502 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.275454 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hkwlr" event={"ID":"d21b0e10-276b-4b90-8d4c-23aa447f6f29","Type":"ContainerStarted","Data":"850c39cf864cbbfaacd6e1a3c7553d191df7861ca5771bc7dee657558b766319"}
Apr 22 18:46:08.278424 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.278360 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"f13615f2643fbef8b2c9b9d12362aa88a08798c5d3c0c5934c4bba8c21601181"}
Apr 22 18:46:08.281830 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.281805 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hgtq9" event={"ID":"62c7a7b2-9390-4fb2-84e7-23f981d15afe","Type":"ContainerStarted","Data":"e0565b4ef76f4073ab77519456fec6e9b533c86594ae54d53b2c2c01406d6e4a"}
Apr 22 18:46:08.286692 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.286668 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerStarted","Data":"aafc6c646f8664769297c2052c3fc8b6eb3582056fdbbeb35a2f14d92677272d"}
Apr 22 18:46:08.294106 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.293941 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" event={"ID":"7a529510-3e55-4661-b315-c2dac62260b8","Type":"ContainerStarted","Data":"844295f7c2f2373ccb51ceec75a67eb88b4da43f99fd8671e52995368cd910ab"}
Apr 22 18:46:08.702958 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.702533 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:08.702958 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:08.702701 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:08.702958 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:08.702787 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:10.702769214 +0000 UTC m=+6.066569085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:08.803606 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:08.803531 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:08.803751 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:08.803713 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:08.803751 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:08.803734 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:08.803751 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:08.803746 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:08.803943 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:08.803803 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:10.803785746 +0000 UTC m=+6.167585620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:09.256323 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:09.256075 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:09.256323 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:09.256213 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:09.256800 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:09.256548 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:09.256800 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:09.256673 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:09.301810 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:09.301770 2535 generic.go:358] "Generic (PLEG): container finished" podID="6b3c4961afadf912d9474ffe329248eb" containerID="dddf6fa6a3e3bfaea06b9401e9b34464c628b080d201c460e6c035f66dcd4a7f" exitCode=0
Apr 22 18:46:09.303036 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:09.302967 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" event={"ID":"6b3c4961afadf912d9474ffe329248eb","Type":"ContainerDied","Data":"dddf6fa6a3e3bfaea06b9401e9b34464c628b080d201c460e6c035f66dcd4a7f"}
Apr 22 18:46:09.318688 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:09.317796 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-85.ec2.internal" podStartSLOduration=3.317779954 podStartE2EDuration="3.317779954s" podCreationTimestamp="2026-04-22 18:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:08.284677764 +0000 UTC m=+3.648477655" watchObservedRunningTime="2026-04-22 18:46:09.317779954 +0000 UTC m=+4.681579847"
Apr 22 18:46:10.310030 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:10.309972 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" event={"ID":"6b3c4961afadf912d9474ffe329248eb","Type":"ContainerStarted","Data":"8993e8378a36bfa17cd75ae5b62a6a859440a512971710bc72c734e0f8e649ec"}
Apr 22 18:46:10.720017 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:10.719354 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:10.720017 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:10.719517 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:10.720017 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:10.719579 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:14.719559792 +0000 UTC m=+10.083359659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:10.820226 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:10.820135 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:10.820831 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:10.820397 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:10.820831 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:10.820427 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:10.820831 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:10.820440 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:10.820831 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:10.820503 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:14.820483783 +0000 UTC m=+10.184283668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:11.256007 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:11.255977 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:11.256164 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:11.256105 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:11.256242 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:11.256157 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:11.256295 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:11.256276 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:13.255268 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:13.254719 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:13.255268 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:13.254844 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:13.255268 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:13.255167 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:13.255268 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:13.255239 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:14.753191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:14.753155 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:14.753590 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:14.753343 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:14.753590 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:14.753405 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.753385265 +0000 UTC m=+18.117185147 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:14.853889 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:14.853852 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:14.854086 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:14.854034 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:14.854086 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:14.854056 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:14.854086 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:14.854070 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:14.854263 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:14.854126 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:22.854110727 +0000 UTC m=+18.217910595 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:15.255363 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:15.255328 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:15.255512 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:15.255456 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:15.256013 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:15.255985 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:15.256117 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:15.256085 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:17.254791 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:17.254759 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:17.255189 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:17.254797 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:17.255189 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:17.254923 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:17.255189 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:17.255040 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:19.254725 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:19.254686 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:19.255172 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:19.254804 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:19.255172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:19.254674 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:19.255264 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:19.255229 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:21.253946 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:21.253914 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:21.253946 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:21.253954 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:21.254363 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:21.254034 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:21.254363 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:21.254159 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:22.816615 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:22.816578 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:22.817145 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:22.816761 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:22.817145 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:22.816862 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.816843193 +0000 UTC m=+34.180643078 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:22.916936 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:22.916893 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:22.917106 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:22.917088 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:22.917186 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:22.917113 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:22.917186 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:22.917126 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:22.917287 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:22.917194 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.917175198 +0000 UTC m=+34.280975067 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:23.260257 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:23.260175 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:23.260408 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:23.260185 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:23.260408 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:23.260298 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:23.260510 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:23.260410 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:25.258033 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.257808 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:46:25.258287 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:25.258150 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc"
Apr 22 18:46:25.258543 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.258524 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm"
Apr 22 18:46:25.258643 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:25.258624 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2"
Apr 22 18:46:25.335678 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.335652 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-642sx" event={"ID":"476d713f-1df9-4369-9fcb-94a61680226e","Type":"ContainerStarted","Data":"d857cfcf2922240bc37a946878ff7b1e6cc7c21fc642cc8e72b9f52f2331fd22"}
Apr 22 18:46:25.336891 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.336868 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cmrnt" event={"ID":"92f3b87a-b94c-43ff-b9f1-8f66016fc2ce","Type":"ContainerStarted","Data":"9a11627825be884f0fe00a173bcf33c57eaaa9dd22228004ef05aa260bca9229"}
Apr 22 18:46:25.338111 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.338090 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" event={"ID":"6aedca06-2e88-42cc-a622-3d71dec7b063","Type":"ContainerStarted","Data":"c091a422acd527e59daf1a9c8d5a64a141bed778df6eaeef0ae5ddb1bc24c057"}
Apr 22 18:46:25.349675 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.349630 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-85.ec2.internal" podStartSLOduration=19.3496197 podStartE2EDuration="19.3496197s" podCreationTimestamp="2026-04-22 18:46:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:10.343137955 +0000 UTC m=+5.706937844" watchObservedRunningTime="2026-04-22 18:46:25.3496197 +0000 UTC m=+20.713419592"
Apr 22 18:46:25.349957 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.349936 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cmrnt" podStartSLOduration=3.01109828 podStartE2EDuration="20.349930422s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.824764234 +0000 UTC m=+3.188564101" lastFinishedPulling="2026-04-22 18:46:25.163596374 +0000 UTC m=+20.527396243" observedRunningTime="2026-04-22 18:46:25.349325008 +0000 UTC m=+20.713124897" watchObservedRunningTime="2026-04-22 18:46:25.349930422 +0000 UTC m=+20.713730315"
Apr 22 18:46:25.369430 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:25.369395 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5rsfp" podStartSLOduration=3.025004331 podStartE2EDuration="20.369384475s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.819047199 +0000 UTC m=+3.182847066" lastFinishedPulling="2026-04-22 18:46:25.163427344 +0000 UTC m=+20.527227210" observedRunningTime="2026-04-22 18:46:25.368673933 +0000 UTC m=+20.732473821" watchObservedRunningTime="2026-04-22 18:46:25.369384475 +0000 UTC m=+20.733184363"
Apr 22 18:46:26.341727 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.341460 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pjv5t" event={"ID":"2251f062-5650-40fb-b187-729124eb8087","Type":"ContainerStarted","Data":"e6f6592c401c8af7b0851aeab489a7f10a7693c7bbbfc308135783a615aa7732"}
Apr 22 18:46:26.344092 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.344062 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"c259dc478fcb2a23b1c1ff120a249504889a9ab9c535611f51a3ce96c5bfbf52"}
Apr 22 18:46:26.344092 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.344094 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"071a909ff14cbc17aaf9ce835597d54ea75c37f3e455e561c8c1be65281614ca"}
Apr 22 18:46:26.344253 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.344104 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"cd5aac92e06cc254ea39c9f17f41c21269fd4971ff022a344b437db070d94679"}
Apr 22 18:46:26.344253 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.344111 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"37050ec8755ec93d2d29f0f92eb29b3fc35e214d58a4b10f92bc792d07d2b0a1"}
Apr 22 18:46:26.344253 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.344119 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"113e66a6cbda7172e394cb092d57ae4157bb2fcdc575cfbe598e04649c5c394b"}
Apr 22 18:46:26.344253 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.344127 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"1cca273094dc4198252ea878c3bde69c97aed5e5744b87c1e9d2407d3695f0aa"}
Apr 22 18:46:26.345285 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.345262 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hgtq9" event={"ID":"62c7a7b2-9390-4fb2-84e7-23f981d15afe","Type":"ContainerStarted","Data":"db0a6b6208769721f0340a09c6a85b5bcf1329b354a9be1f2ce93e9d788c4357"}
Apr 22 18:46:26.346493 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.346469 2535 generic.go:358] "Generic (PLEG): container finished" podID="ac730e3f-49e1-4703-9a89-4d82e11d265d" containerID="9d89aa023fcf5b6aa5e45d4931f0eafe70d2bf8ebf244a75027598682a57aa42" exitCode=0
Apr 22 18:46:26.346577 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.346542 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerDied","Data":"9d89aa023fcf5b6aa5e45d4931f0eafe70d2bf8ebf244a75027598682a57aa42"}
Apr 22 18:46:26.347793 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.347772 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" event={"ID":"7a529510-3e55-4661-b315-c2dac62260b8","Type":"ContainerStarted","Data":"a9a6d33e8d6d48b11cc53d52b82332c7afd1a826abbe727616756e79be7d065a"}
Apr 22 18:46:26.359086 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.359048 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pjv5t" podStartSLOduration=3.827377869 podStartE2EDuration="21.359035249s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.828054639 +0000 UTC m=+3.191854507" lastFinishedPulling="2026-04-22 18:46:25.35971202 +0000 UTC m=+20.723511887" observedRunningTime="2026-04-22 18:46:26.359004687 +0000 UTC m=+21.722804577" watchObservedRunningTime="2026-04-22 18:46:26.359035249 +0000 UTC m=+21.722835119"
Apr 22 18:46:26.394150 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.394115 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-642sx" podStartSLOduration=4.05645251 podStartE2EDuration="21.394105551s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.826155981 +0000 UTC m=+3.189955857" lastFinishedPulling="2026-04-22 18:46:25.163809018 +0000 UTC m=+20.527608898" observedRunningTime="2026-04-22 18:46:26.394001242 +0000 UTC m=+21.757801131" watchObservedRunningTime="2026-04-22 18:46:26.394105551 +0000 UTC m=+21.757905439"
Apr 22 18:46:26.408566 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.408522 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hgtq9" podStartSLOduration=4.064608652 podStartE2EDuration="21.408341891s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.831536552 +0000 UTC m=+3.195336422" lastFinishedPulling="2026-04-22 18:46:25.175269793 +0000 UTC m=+20.539069661" observedRunningTime="2026-04-22 18:46:26.408267328 +0000 UTC m=+21.772067207" watchObservedRunningTime="2026-04-22 18:46:26.408341891 +0000 UTC m=+21.772141770"
Apr 22 18:46:26.526960 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.526940 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cmrnt"
Apr 22 18:46:26.527734 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.527716 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cmrnt"
Apr 22 18:46:26.642676 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:26.642650 2535 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:46:27.168695 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.168404 2535 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:26.642671371Z","UUID":"b38d68cc-1d1b-4493-b256-f2679b0030a4","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:46:27.170593 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.170571 2535 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr
22 18:46:27.170717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.170602 2535 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:46:27.254549 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.254514 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:27.254708 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.254563 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:27.254708 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:27.254665 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:27.255100 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:27.255077 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:27.350822 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.350789 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hkwlr" event={"ID":"d21b0e10-276b-4b90-8d4c-23aa447f6f29","Type":"ContainerStarted","Data":"b3156669a40828d11746168b22e9207149cc33e41afd006a97fe3ed17a385829"} Apr 22 18:46:27.352975 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.352942 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" event={"ID":"7a529510-3e55-4661-b315-c2dac62260b8","Type":"ContainerStarted","Data":"d7d85b162ffed3275d29d0d253c6b59122805ffd6f82b8b12863f50c99ff8459"} Apr 22 18:46:27.364468 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:27.364425 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hkwlr" podStartSLOduration=5.02287583 podStartE2EDuration="22.364413898s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.821920157 +0000 UTC m=+3.185720038" lastFinishedPulling="2026-04-22 18:46:25.163458234 +0000 UTC m=+20.527258106" observedRunningTime="2026-04-22 18:46:27.364305385 +0000 UTC m=+22.728105276" watchObservedRunningTime="2026-04-22 18:46:27.364413898 +0000 UTC m=+22.728213786" Apr 22 18:46:28.359220 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:28.359191 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"d6babe5834c56e1e75df8049d9ebfd3e58812536d14f1f0a4388e679453754a2"} Apr 22 18:46:28.362604 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:28.362575 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" 
event={"ID":"7a529510-3e55-4661-b315-c2dac62260b8","Type":"ContainerStarted","Data":"690586593db375b9e98c8558fcec2e90291ae5e877d40bba85575b5e17e5b4a8"} Apr 22 18:46:28.362732 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:28.362658 2535 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:46:28.378770 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:28.378731 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gq5hm" podStartSLOduration=3.744627988 podStartE2EDuration="23.378715798s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.830973384 +0000 UTC m=+3.194773264" lastFinishedPulling="2026-04-22 18:46:27.465061192 +0000 UTC m=+22.828861074" observedRunningTime="2026-04-22 18:46:28.378583084 +0000 UTC m=+23.742382972" watchObservedRunningTime="2026-04-22 18:46:28.378715798 +0000 UTC m=+23.742515688" Apr 22 18:46:29.253982 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.253952 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:29.254157 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.253992 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:29.254157 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.254082 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:29.254259 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.254177 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:29.278789 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.278766 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7njpk"] Apr 22 18:46:29.284747 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.284725 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.284883 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.284810 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7njpk" podUID="8f788b1d-19db-4fcb-a814-0c4d333cdee1" Apr 22 18:46:29.362630 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.362601 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8f788b1d-19db-4fcb-a814-0c4d333cdee1-kubelet-config\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.363379 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.362644 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8f788b1d-19db-4fcb-a814-0c4d333cdee1-dbus\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.363379 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.362707 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.463437 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.463406 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8f788b1d-19db-4fcb-a814-0c4d333cdee1-kubelet-config\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.463586 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.463442 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/8f788b1d-19db-4fcb-a814-0c4d333cdee1-dbus\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.463586 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.463472 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.463586 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.463543 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8f788b1d-19db-4fcb-a814-0c4d333cdee1-kubelet-config\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.463586 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.463570 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:29.463775 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.463634 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret podName:8f788b1d-19db-4fcb-a814-0c4d333cdee1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:29.963619626 +0000 UTC m=+25.327419493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret") pod "global-pull-secret-syncer-7njpk" (UID: "8f788b1d-19db-4fcb-a814-0c4d333cdee1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:29.463775 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.463733 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8f788b1d-19db-4fcb-a814-0c4d333cdee1-dbus\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.968614 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:29.968582 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:29.968788 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.968755 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:29.968848 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:29.968800 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret podName:8f788b1d-19db-4fcb-a814-0c4d333cdee1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:30.968787485 +0000 UTC m=+26.332587352 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret") pod "global-pull-secret-syncer-7njpk" (UID: "8f788b1d-19db-4fcb-a814-0c4d333cdee1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:30.976276 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:30.976116 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:30.976276 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:30.976264 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:30.976686 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:30.976329 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret podName:8f788b1d-19db-4fcb-a814-0c4d333cdee1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:32.976309629 +0000 UTC m=+28.340109499 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret") pod "global-pull-secret-syncer-7njpk" (UID: "8f788b1d-19db-4fcb-a814-0c4d333cdee1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:31.254354 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.254270 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:31.254354 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.254319 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:31.254519 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:31.254415 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:31.254519 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:31.254447 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:31.254519 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.254475 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:31.254621 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:31.254543 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7njpk" podUID="8f788b1d-19db-4fcb-a814-0c4d333cdee1" Apr 22 18:46:31.368232 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.368202 2535 generic.go:358] "Generic (PLEG): container finished" podID="ac730e3f-49e1-4703-9a89-4d82e11d265d" containerID="0d911d9f26d1bab2144793415d8434f957247f363a0e827afcb7f75d9978d2cc" exitCode=0 Apr 22 18:46:31.368355 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.368287 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerDied","Data":"0d911d9f26d1bab2144793415d8434f957247f363a0e827afcb7f75d9978d2cc"} Apr 22 18:46:31.371384 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.371361 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" event={"ID":"88bb7a51-9742-46e6-817e-2c17c4357d07","Type":"ContainerStarted","Data":"8aeeaf0200cad1400652890e737c672c0e1982bc8069900a28bb547fc11c147c"} Apr 22 18:46:31.371709 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.371688 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:31.371830 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.371717 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:31.371830 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.371730 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:31.386675 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.386655 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:31.386792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.386764 2535 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" Apr 22 18:46:31.417204 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:31.417150 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf" podStartSLOduration=8.555879051 podStartE2EDuration="26.417138254s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.820382311 +0000 UTC m=+3.184182179" lastFinishedPulling="2026-04-22 18:46:25.681641502 +0000 UTC m=+21.045441382" observedRunningTime="2026-04-22 18:46:31.416613808 +0000 UTC m=+26.780413697" watchObservedRunningTime="2026-04-22 18:46:31.417138254 +0000 UTC m=+26.780938145" Apr 22 18:46:32.375405 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.375231 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerStarted","Data":"5d14fad10a2a0a678f3dc7f71813b282d349775ffa781bde196ebdc05bae6c1e"} Apr 22 18:46:32.502312 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.502248 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-89stm"] Apr 22 18:46:32.502419 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.502351 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:32.502471 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:32.502445 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:32.505421 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.505397 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5g7dk"] Apr 22 18:46:32.505536 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.505490 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:32.505583 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:32.505561 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:32.514090 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.514068 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7njpk"] Apr 22 18:46:32.514242 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.514171 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:32.514311 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:32.514262 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7njpk" podUID="8f788b1d-19db-4fcb-a814-0c4d333cdee1" Apr 22 18:46:32.991123 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:32.991032 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:32.991273 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:32.991179 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:32.991273 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:32.991249 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret podName:8f788b1d-19db-4fcb-a814-0c4d333cdee1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:36.991229139 +0000 UTC m=+32.355029006 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret") pod "global-pull-secret-syncer-7njpk" (UID: "8f788b1d-19db-4fcb-a814-0c4d333cdee1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:33.379251 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:33.379158 2535 generic.go:358] "Generic (PLEG): container finished" podID="ac730e3f-49e1-4703-9a89-4d82e11d265d" containerID="5d14fad10a2a0a678f3dc7f71813b282d349775ffa781bde196ebdc05bae6c1e" exitCode=0 Apr 22 18:46:33.379251 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:33.379235 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerDied","Data":"5d14fad10a2a0a678f3dc7f71813b282d349775ffa781bde196ebdc05bae6c1e"} Apr 22 18:46:34.254533 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:34.254459 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:34.254653 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:34.254461 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:34.254653 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:34.254461 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:34.254653 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:34.254550 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:34.254763 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:34.254678 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:34.254763 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:34.254732 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7njpk" podUID="8f788b1d-19db-4fcb-a814-0c4d333cdee1" Apr 22 18:46:34.384082 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:34.384052 2535 generic.go:358] "Generic (PLEG): container finished" podID="ac730e3f-49e1-4703-9a89-4d82e11d265d" containerID="8b2f7f5047144ca5d917b20e4337c6e94a64519da0ec8b5eb0e45d72e06dc03f" exitCode=0 Apr 22 18:46:34.384455 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:34.384113 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerDied","Data":"8b2f7f5047144ca5d917b20e4337c6e94a64519da0ec8b5eb0e45d72e06dc03f"} Apr 22 18:46:36.213509 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:36.213474 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:36.214220 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:36.213615 2535 prober_manager.go:312] "Failed 
to trigger a manual run" probe="Readiness" Apr 22 18:46:36.214220 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:36.214078 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cmrnt" Apr 22 18:46:36.254221 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:36.254196 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:36.254221 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:36.254217 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:36.254443 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:36.254218 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:36.254443 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:36.254292 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7njpk" podUID="8f788b1d-19db-4fcb-a814-0c4d333cdee1" Apr 22 18:46:36.254443 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:36.254389 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:36.254593 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:36.254512 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:37.018854 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:37.018761 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:37.019131 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:37.018953 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:37.019131 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:37.019030 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret podName:8f788b1d-19db-4fcb-a814-0c4d333cdee1 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:45.019010313 +0000 UTC m=+40.382810197 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret") pod "global-pull-secret-syncer-7njpk" (UID: "8f788b1d-19db-4fcb-a814-0c4d333cdee1") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:38.254955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.254861 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:38.255534 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.254861 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:38.255534 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.254986 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7njpk" podUID="8f788b1d-19db-4fcb-a814-0c4d333cdee1" Apr 22 18:46:38.255534 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.254861 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:38.255534 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.255065 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-89stm" podUID="85ff2eb7-3fb1-424b-9402-d67103c35bf2" Apr 22 18:46:38.255534 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.255134 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5g7dk" podUID="252dfd14-9c83-4928-bbcd-d84b479525bc" Apr 22 18:46:38.472601 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.472567 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-85.ec2.internal" event="NodeReady" Apr 22 18:46:38.472768 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.472703 2535 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:46:38.509140 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.509073 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86798d6c57-pqh5r"] Apr 22 18:46:38.531390 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.531124 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xlf8v"] Apr 22 18:46:38.531545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.531409 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.534128 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.534103 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:46:38.534128 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.534110 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:46:38.534320 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.534284 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:46:38.534459 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.534368 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mvb7n\"" Apr 22 18:46:38.540823 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.539968 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:46:38.544620 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.543753 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5xfcn"] Apr 22 18:46:38.544620 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.544147 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:38.546613 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.546595 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:46:38.546763 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.546743 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:46:38.546867 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.546841 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5j5kr\"" Apr 22 18:46:38.547005 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.546983 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:46:38.553422 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.553398 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86798d6c57-pqh5r"] Apr 22 18:46:38.553422 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.553425 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xlf8v"] Apr 22 18:46:38.553561 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.553436 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5xfcn"] Apr 22 18:46:38.553561 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.553526 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.555625 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.555607 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnqbz\"" Apr 22 18:46:38.555717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.555641 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:46:38.555717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.555658 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:46:38.631204 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631174 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/068518bd-bcf7-445a-954c-83b86af21011-config-volume\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.631347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631227 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-ca-trust-extracted\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631258 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-bound-sa-token\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631347 
ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631283 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/068518bd-bcf7-445a-954c-83b86af21011-tmp-dir\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.631347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631313 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k95jf\" (UniqueName: \"kubernetes.io/projected/ff946298-48b5-420f-ae4d-8f56d72aebaf-kube-api-access-k95jf\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:38.631534 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631411 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-certificates\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631534 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631447 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj4mm\" (UniqueName: \"kubernetes.io/projected/068518bd-bcf7-445a-954c-83b86af21011-kube-api-access-qj4mm\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.631534 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631479 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod 
\"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.631534 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631518 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631706 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631546 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-installation-pull-secrets\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631706 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631574 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxqc\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-kube-api-access-8dxqc\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631706 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631597 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:38.631706 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631661 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-image-registry-private-configuration\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.631860 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.631703 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-trusted-ca\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.732731 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.732695 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.732933 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.732751 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-installation-pull-secrets\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.732933 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.732862 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:38.732933 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.732887 2535 projected.go:194] 
Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86798d6c57-pqh5r: secret "image-registry-tls" not found Apr 22 18:46:38.732933 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.732881 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxqc\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-kube-api-access-8dxqc\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.733093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.732939 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:38.733093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.732965 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-image-registry-private-configuration\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.733093 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.732991 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls podName:b20ac4ad-b759-4b80-9fc7-1085aeefc7ff nodeName:}" failed. No retries permitted until 2026-04-22 18:46:39.232968992 +0000 UTC m=+34.596768877 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls") pod "image-registry-86798d6c57-pqh5r" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff") : secret "image-registry-tls" not found Apr 22 18:46:38.733093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733020 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-trusted-ca\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.733093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733044 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/068518bd-bcf7-445a-954c-83b86af21011-config-volume\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.733093 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733070 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-ca-trust-extracted\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.733375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733096 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-bound-sa-token\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.733375 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:38.733119 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/068518bd-bcf7-445a-954c-83b86af21011-tmp-dir\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.733375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733153 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k95jf\" (UniqueName: \"kubernetes.io/projected/ff946298-48b5-420f-ae4d-8f56d72aebaf-kube-api-access-k95jf\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:38.733375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733217 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-certificates\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.733375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733238 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj4mm\" (UniqueName: \"kubernetes.io/projected/068518bd-bcf7-445a-954c-83b86af21011-kube-api-access-qj4mm\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.733375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733266 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 
18:46:38.733375 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.733370 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:38.733697 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.733412 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls podName:068518bd-bcf7-445a-954c-83b86af21011 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:39.233396891 +0000 UTC m=+34.597196761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls") pod "dns-default-5xfcn" (UID: "068518bd-bcf7-445a-954c-83b86af21011") : secret "dns-default-metrics-tls" not found Apr 22 18:46:38.733697 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.733517 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:38.733697 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.733583 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert podName:ff946298-48b5-420f-ae4d-8f56d72aebaf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:39.233564361 +0000 UTC m=+34.597364229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert") pod "ingress-canary-xlf8v" (UID: "ff946298-48b5-420f-ae4d-8f56d72aebaf") : secret "canary-serving-cert" not found Apr 22 18:46:38.733893 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.733793 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/068518bd-bcf7-445a-954c-83b86af21011-tmp-dir\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.734067 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.734044 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-ca-trust-extracted\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.734172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.734145 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/068518bd-bcf7-445a-954c-83b86af21011-config-volume\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:38.734172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.734165 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-certificates\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.734822 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.734741 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-trusted-ca\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.737472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.737449 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-installation-pull-secrets\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.737571 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.737450 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-image-registry-private-configuration\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.742438 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.742415 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxqc\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-kube-api-access-8dxqc\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.742547 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.742476 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj4mm\" (UniqueName: \"kubernetes.io/projected/068518bd-bcf7-445a-954c-83b86af21011-kube-api-access-qj4mm\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 
22 18:46:38.742810 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.742777 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-bound-sa-token\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:38.742919 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.742869 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k95jf\" (UniqueName: \"kubernetes.io/projected/ff946298-48b5-420f-ae4d-8f56d72aebaf-kube-api-access-k95jf\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:38.834497 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.834264 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:38.834497 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.834419 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:38.834699 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.834549 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs podName:252dfd14-9c83-4928-bbcd-d84b479525bc nodeName:}" failed. No retries permitted until 2026-04-22 18:47:10.834530719 +0000 UTC m=+66.198330588 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs") pod "network-metrics-daemon-5g7dk" (UID: "252dfd14-9c83-4928-bbcd-d84b479525bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:38.935429 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:38.935386 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:38.935636 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.935616 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:38.935636 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.935634 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:38.935800 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.935647 2535 projected.go:194] Error preparing data for projected volume kube-api-access-9v2np for pod openshift-network-diagnostics/network-check-target-89stm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:38.935800 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:38.935709 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np podName:85ff2eb7-3fb1-424b-9402-d67103c35bf2 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:10.93569104 +0000 UTC m=+66.299490908 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9v2np" (UniqueName: "kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np") pod "network-check-target-89stm" (UID: "85ff2eb7-3fb1-424b-9402-d67103c35bf2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:39.236885 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.236803 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:39.236885 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.236880 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:39.237100 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.236931 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:39.237100 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.236954 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:39.237100 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.237012 2535 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls podName:068518bd-bcf7-445a-954c-83b86af21011 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:40.236996494 +0000 UTC m=+35.600796364 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls") pod "dns-default-5xfcn" (UID: "068518bd-bcf7-445a-954c-83b86af21011") : secret "dns-default-metrics-tls" not found Apr 22 18:46:39.237100 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.237042 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:39.237100 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.237085 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert podName:ff946298-48b5-420f-ae4d-8f56d72aebaf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:40.237073262 +0000 UTC m=+35.600873129 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert") pod "ingress-canary-xlf8v" (UID: "ff946298-48b5-420f-ae4d-8f56d72aebaf") : secret "canary-serving-cert" not found Apr 22 18:46:39.237260 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.237144 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:39.237260 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.237152 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86798d6c57-pqh5r: secret "image-registry-tls" not found Apr 22 18:46:39.237260 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:39.237173 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls podName:b20ac4ad-b759-4b80-9fc7-1085aeefc7ff nodeName:}" failed. No retries permitted until 2026-04-22 18:46:40.237166587 +0000 UTC m=+35.600966455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls") pod "image-registry-86798d6c57-pqh5r" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff") : secret "image-registry-tls" not found Apr 22 18:46:39.848163 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.848126 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c59lp"] Apr 22 18:46:39.874658 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.874634 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c59lp"] Apr 22 18:46:39.874806 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.874741 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:39.877448 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.877421 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:46:39.878204 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.878177 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:46:39.878334 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.878190 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-pzwbg\"" Apr 22 18:46:39.878334 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.878320 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:46:39.878443 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:39.878356 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:46:40.044169 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.044134 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52795c72-c0eb-4ff8-a408-490265dbe6ed-signing-key\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.044169 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.044182 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52795c72-c0eb-4ff8-a408-490265dbe6ed-signing-cabundle\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" 
Apr 22 18:46:40.044347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.044238 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mzll\" (UniqueName: \"kubernetes.io/projected/52795c72-c0eb-4ff8-a408-490265dbe6ed-kube-api-access-4mzll\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.145272 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.145167 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52795c72-c0eb-4ff8-a408-490265dbe6ed-signing-key\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.145272 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.145211 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52795c72-c0eb-4ff8-a408-490265dbe6ed-signing-cabundle\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.145446 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.145326 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mzll\" (UniqueName: \"kubernetes.io/projected/52795c72-c0eb-4ff8-a408-490265dbe6ed-kube-api-access-4mzll\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.145866 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.145841 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52795c72-c0eb-4ff8-a408-490265dbe6ed-signing-cabundle\") pod 
\"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.147982 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.147966 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52795c72-c0eb-4ff8-a408-490265dbe6ed-signing-key\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.152601 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.152580 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mzll\" (UniqueName: \"kubernetes.io/projected/52795c72-c0eb-4ff8-a408-490265dbe6ed-kube-api-access-4mzll\") pod \"service-ca-865cb79987-c59lp\" (UID: \"52795c72-c0eb-4ff8-a408-490265dbe6ed\") " pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.185174 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.185147 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-c59lp" Apr 22 18:46:40.246031 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.246006 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:40.246124 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.246051 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:40.246124 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.246070 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:40.246190 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246152 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:40.246190 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246156 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:40.246251 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246158 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:40.246251 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246204 2535 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert podName:ff946298-48b5-420f-ae4d-8f56d72aebaf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:42.246191138 +0000 UTC m=+37.609991005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert") pod "ingress-canary-xlf8v" (UID: "ff946298-48b5-420f-ae4d-8f56d72aebaf") : secret "canary-serving-cert" not found Apr 22 18:46:40.246251 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246204 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86798d6c57-pqh5r: secret "image-registry-tls" not found Apr 22 18:46:40.246251 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246216 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls podName:068518bd-bcf7-445a-954c-83b86af21011 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:42.246210507 +0000 UTC m=+37.610010374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls") pod "dns-default-5xfcn" (UID: "068518bd-bcf7-445a-954c-83b86af21011") : secret "dns-default-metrics-tls" not found Apr 22 18:46:40.246251 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:40.246235 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls podName:b20ac4ad-b759-4b80-9fc7-1085aeefc7ff nodeName:}" failed. No retries permitted until 2026-04-22 18:46:42.246224181 +0000 UTC m=+37.610024048 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls") pod "image-registry-86798d6c57-pqh5r" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff") : secret "image-registry-tls" not found Apr 22 18:46:40.254473 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.254457 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:40.254473 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.254470 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:46:40.254576 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.254470 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:46:40.257283 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.257254 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:46:40.257375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.257293 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:46:40.257375 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.257321 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-scwsl\"" Apr 22 18:46:40.258252 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.258233 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:46:40.258347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.258251 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 
18:46:40.258347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.258235 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4wrrh\"" Apr 22 18:46:40.556669 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:40.556642 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-c59lp"] Apr 22 18:46:40.560494 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:40.560470 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52795c72_c0eb_4ff8_a408_490265dbe6ed.slice/crio-189cbcb60b64520d3516f054a3710fb2c7429fcfaf12b79b131bde7487e88d47 WatchSource:0}: Error finding container 189cbcb60b64520d3516f054a3710fb2c7429fcfaf12b79b131bde7487e88d47: Status 404 returned error can't find the container with id 189cbcb60b64520d3516f054a3710fb2c7429fcfaf12b79b131bde7487e88d47 Apr 22 18:46:41.400230 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:41.400199 2535 generic.go:358] "Generic (PLEG): container finished" podID="ac730e3f-49e1-4703-9a89-4d82e11d265d" containerID="7e052f2ce6b134a3ff694b3011317e41a18decc4c380ee01084203978f8eed4a" exitCode=0 Apr 22 18:46:41.400637 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:41.400278 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerDied","Data":"7e052f2ce6b134a3ff694b3011317e41a18decc4c380ee01084203978f8eed4a"} Apr 22 18:46:41.401768 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:41.401357 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-c59lp" event={"ID":"52795c72-c0eb-4ff8-a408-490265dbe6ed","Type":"ContainerStarted","Data":"189cbcb60b64520d3516f054a3710fb2c7429fcfaf12b79b131bde7487e88d47"} Apr 22 18:46:41.671363 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:41.671290 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hgtq9_62c7a7b2-9390-4fb2-84e7-23f981d15afe/dns-node-resolver/0.log" Apr 22 18:46:42.262643 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:42.262449 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:42.262819 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:42.262685 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:42.262819 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:42.262710 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:42.262819 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262605 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:42.262819 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262805 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:42.262819 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262818 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86798d6c57-pqh5r: 
secret "image-registry-tls" not found Apr 22 18:46:42.263071 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262825 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls podName:068518bd-bcf7-445a-954c-83b86af21011 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.262806738 +0000 UTC m=+41.626606607 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls") pod "dns-default-5xfcn" (UID: "068518bd-bcf7-445a-954c-83b86af21011") : secret "dns-default-metrics-tls" not found Apr 22 18:46:42.263071 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262825 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:42.263071 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262850 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls podName:b20ac4ad-b759-4b80-9fc7-1085aeefc7ff nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.262840525 +0000 UTC m=+41.626640393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls") pod "image-registry-86798d6c57-pqh5r" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff") : secret "image-registry-tls" not found Apr 22 18:46:42.263071 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:42.262890 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert podName:ff946298-48b5-420f-ae4d-8f56d72aebaf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.262870252 +0000 UTC m=+41.626670133 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert") pod "ingress-canary-xlf8v" (UID: "ff946298-48b5-420f-ae4d-8f56d72aebaf") : secret "canary-serving-cert" not found Apr 22 18:46:42.407685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:42.407601 2535 generic.go:358] "Generic (PLEG): container finished" podID="ac730e3f-49e1-4703-9a89-4d82e11d265d" containerID="a765eee057017a9e185b2c299c973a90e866011124b69e905e0441badbe4ae8a" exitCode=0 Apr 22 18:46:42.407685 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:42.407666 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerDied","Data":"a765eee057017a9e185b2c299c973a90e866011124b69e905e0441badbe4ae8a"} Apr 22 18:46:42.670984 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:42.670936 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-642sx_476d713f-1df9-4369-9fcb-94a61680226e/node-ca/0.log" Apr 22 18:46:43.412842 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:43.412765 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-87xk8" event={"ID":"ac730e3f-49e1-4703-9a89-4d82e11d265d","Type":"ContainerStarted","Data":"fcbe723e62ac64b82b3895268485062c1f319e5b11f0a77b38842a800b3d0714"} Apr 22 18:46:43.414115 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:43.414084 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-c59lp" event={"ID":"52795c72-c0eb-4ff8-a408-490265dbe6ed","Type":"ContainerStarted","Data":"6dcb2005953644fd94ee369a47152a1a16de85f273e9a304cb748811bd91f563"} Apr 22 18:46:43.436864 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:43.436822 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-87xk8" 
podStartSLOduration=5.855170298 podStartE2EDuration="38.43678873s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:46:07.830732741 +0000 UTC m=+3.194532610" lastFinishedPulling="2026-04-22 18:46:40.412351173 +0000 UTC m=+35.776151042" observedRunningTime="2026-04-22 18:46:43.435695708 +0000 UTC m=+38.799495596" watchObservedRunningTime="2026-04-22 18:46:43.43678873 +0000 UTC m=+38.800588620" Apr 22 18:46:43.451140 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:43.451087 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-c59lp" podStartSLOduration=2.115746442 podStartE2EDuration="4.451077025s" podCreationTimestamp="2026-04-22 18:46:39 +0000 UTC" firstStartedPulling="2026-04-22 18:46:40.562523454 +0000 UTC m=+35.926323321" lastFinishedPulling="2026-04-22 18:46:42.897854021 +0000 UTC m=+38.261653904" observedRunningTime="2026-04-22 18:46:43.45016334 +0000 UTC m=+38.813963229" watchObservedRunningTime="2026-04-22 18:46:43.451077025 +0000 UTC m=+38.814876913" Apr 22 18:46:45.085992 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:45.085949 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:45.090195 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:45.090159 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8f788b1d-19db-4fcb-a814-0c4d333cdee1-original-pull-secret\") pod \"global-pull-secret-syncer-7njpk\" (UID: \"8f788b1d-19db-4fcb-a814-0c4d333cdee1\") " pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:45.364722 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:46:45.364646 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7njpk" Apr 22 18:46:45.479198 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:45.479164 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7njpk"] Apr 22 18:46:45.481809 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:45.481780 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f788b1d_19db_4fcb_a814_0c4d333cdee1.slice/crio-e031d6270270cbf1a6accde4bc8ddeeda1e3dedf49e1247155daba9457d3a685 WatchSource:0}: Error finding container e031d6270270cbf1a6accde4bc8ddeeda1e3dedf49e1247155daba9457d3a685: Status 404 returned error can't find the container with id e031d6270270cbf1a6accde4bc8ddeeda1e3dedf49e1247155daba9457d3a685 Apr 22 18:46:46.295929 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:46.295879 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:46.295955 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:46.295980 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: 
\"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296056 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296100 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296117 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86798d6c57-pqh5r: secret "image-registry-tls" not found Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296128 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls podName:068518bd-bcf7-445a-954c-83b86af21011 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:54.296106018 +0000 UTC m=+49.659905906 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls") pod "dns-default-5xfcn" (UID: "068518bd-bcf7-445a-954c-83b86af21011") : secret "dns-default-metrics-tls" not found Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296159 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls podName:b20ac4ad-b759-4b80-9fc7-1085aeefc7ff nodeName:}" failed. No retries permitted until 2026-04-22 18:46:54.296146486 +0000 UTC m=+49.659946353 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls") pod "image-registry-86798d6c57-pqh5r" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff") : secret "image-registry-tls" not found Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296101 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:46.296359 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:46.296185 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert podName:ff946298-48b5-420f-ae4d-8f56d72aebaf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:54.296178752 +0000 UTC m=+49.659978619 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert") pod "ingress-canary-xlf8v" (UID: "ff946298-48b5-420f-ae4d-8f56d72aebaf") : secret "canary-serving-cert" not found Apr 22 18:46:46.421806 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:46.421769 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7njpk" event={"ID":"8f788b1d-19db-4fcb-a814-0c4d333cdee1","Type":"ContainerStarted","Data":"e031d6270270cbf1a6accde4bc8ddeeda1e3dedf49e1247155daba9457d3a685"} Apr 22 18:46:50.431149 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:50.431112 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7njpk" event={"ID":"8f788b1d-19db-4fcb-a814-0c4d333cdee1","Type":"ContainerStarted","Data":"07514032dddc293eeace37a07c39c179ad9ad7c20a64abf150f675d1f9f04476"} Apr 22 18:46:50.445847 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:50.445802 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7njpk" 
podStartSLOduration=16.987812402 podStartE2EDuration="21.4457911s" podCreationTimestamp="2026-04-22 18:46:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:45.483485836 +0000 UTC m=+40.847285703" lastFinishedPulling="2026-04-22 18:46:49.941464533 +0000 UTC m=+45.305264401" observedRunningTime="2026-04-22 18:46:50.44557643 +0000 UTC m=+45.809376319" watchObservedRunningTime="2026-04-22 18:46:50.4457911 +0000 UTC m=+45.809590988" Apr 22 18:46:54.355080 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.355039 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:54.355589 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.355111 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:54.355589 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.355140 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:54.357545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.357514 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/068518bd-bcf7-445a-954c-83b86af21011-metrics-tls\") pod \"dns-default-5xfcn\" (UID: \"068518bd-bcf7-445a-954c-83b86af21011\") " pod="openshift-dns/dns-default-5xfcn" Apr 22 
18:46:54.358081 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.358063 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"image-registry-86798d6c57-pqh5r\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:54.358190 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.358169 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff946298-48b5-420f-ae4d-8f56d72aebaf-cert\") pod \"ingress-canary-xlf8v\" (UID: \"ff946298-48b5-420f-ae4d-8f56d72aebaf\") " pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:54.448652 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.448617 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:54.455288 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.455232 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xlf8v" Apr 22 18:46:54.463100 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.462856 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:54.598796 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.598763 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86798d6c57-pqh5r"] Apr 22 18:46:54.604174 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:54.604142 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20ac4ad_b759_4b80_9fc7_1085aeefc7ff.slice/crio-a10456364baba15b56c90a17197c7fed40fffc57026e645c30d90f9110785846 WatchSource:0}: Error finding container a10456364baba15b56c90a17197c7fed40fffc57026e645c30d90f9110785846: Status 404 returned error can't find the container with id a10456364baba15b56c90a17197c7fed40fffc57026e645c30d90f9110785846 Apr 22 18:46:54.611985 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.611960 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xlf8v"] Apr 22 18:46:54.615404 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:54.615378 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff946298_48b5_420f_ae4d_8f56d72aebaf.slice/crio-13baa67035cbd47b07c8e441e52f30fdfaac5c15322c76c432900541b32fb546 WatchSource:0}: Error finding container 13baa67035cbd47b07c8e441e52f30fdfaac5c15322c76c432900541b32fb546: Status 404 returned error can't find the container with id 13baa67035cbd47b07c8e441e52f30fdfaac5c15322c76c432900541b32fb546 Apr 22 18:46:54.627836 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:54.627802 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5xfcn"] Apr 22 18:46:54.631110 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:54.631090 2535 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod068518bd_bcf7_445a_954c_83b86af21011.slice/crio-23ffec75c163348b0533c163ee5005ed3fcd14c32a632fa5049b57c52bfad85b WatchSource:0}: Error finding container 23ffec75c163348b0533c163ee5005ed3fcd14c32a632fa5049b57c52bfad85b: Status 404 returned error can't find the container with id 23ffec75c163348b0533c163ee5005ed3fcd14c32a632fa5049b57c52bfad85b Apr 22 18:46:55.443909 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:55.443874 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" event={"ID":"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff","Type":"ContainerStarted","Data":"8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9"} Apr 22 18:46:55.444384 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:55.443938 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" event={"ID":"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff","Type":"ContainerStarted","Data":"a10456364baba15b56c90a17197c7fed40fffc57026e645c30d90f9110785846"} Apr 22 18:46:55.444680 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:55.444647 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:46:55.446279 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:55.446248 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xfcn" event={"ID":"068518bd-bcf7-445a-954c-83b86af21011","Type":"ContainerStarted","Data":"23ffec75c163348b0533c163ee5005ed3fcd14c32a632fa5049b57c52bfad85b"} Apr 22 18:46:55.448716 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:55.448689 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xlf8v" 
event={"ID":"ff946298-48b5-420f-ae4d-8f56d72aebaf","Type":"ContainerStarted","Data":"13baa67035cbd47b07c8e441e52f30fdfaac5c15322c76c432900541b32fb546"} Apr 22 18:46:57.454875 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:57.454833 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xfcn" event={"ID":"068518bd-bcf7-445a-954c-83b86af21011","Type":"ContainerStarted","Data":"6bd224bdce01ecee6c5c196abf53adf5ed571c50b11ade68a31b719d29b96689"} Apr 22 18:46:57.454875 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:57.454878 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xfcn" event={"ID":"068518bd-bcf7-445a-954c-83b86af21011","Type":"ContainerStarted","Data":"13a8a11889e3421a97f35f4c505ca01dae17d616231a0e9d761c55ca79415cf9"} Apr 22 18:46:57.455334 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:57.454956 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5xfcn" Apr 22 18:46:57.456109 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:57.456083 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xlf8v" event={"ID":"ff946298-48b5-420f-ae4d-8f56d72aebaf","Type":"ContainerStarted","Data":"7d8d115be13da72831814f6281f3a27fbebd6ae40962a337b38a9a11c1afbcbe"} Apr 22 18:46:57.470703 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:57.470657 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" podStartSLOduration=50.470645637 podStartE2EDuration="50.470645637s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:55.464525296 +0000 UTC m=+50.828325184" watchObservedRunningTime="2026-04-22 18:46:57.470645637 +0000 UTC m=+52.834445507" Apr 22 18:46:57.470863 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:46:57.470843 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5xfcn" podStartSLOduration=17.477692246 podStartE2EDuration="19.470836966s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:46:54.63321746 +0000 UTC m=+49.997017331" lastFinishedPulling="2026-04-22 18:46:56.62636218 +0000 UTC m=+51.990162051" observedRunningTime="2026-04-22 18:46:57.470699374 +0000 UTC m=+52.834499264" watchObservedRunningTime="2026-04-22 18:46:57.470836966 +0000 UTC m=+52.834636855" Apr 22 18:46:58.628505 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.628401 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xlf8v" podStartSLOduration=18.616757412 podStartE2EDuration="20.628381414s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:46:54.617688615 +0000 UTC m=+49.981488488" lastFinishedPulling="2026-04-22 18:46:56.629312623 +0000 UTC m=+51.993112490" observedRunningTime="2026-04-22 18:46:57.484489299 +0000 UTC m=+52.848289189" watchObservedRunningTime="2026-04-22 18:46:58.628381414 +0000 UTC m=+53.992181289" Apr 22 18:46:58.629447 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.629425 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"] Apr 22 18:46:58.632261 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.632241 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f" Apr 22 18:46:58.634852 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.634826 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:46:58.634962 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.634832 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-nz4qn\"" Apr 22 18:46:58.647152 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.647128 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"] Apr 22 18:46:58.653586 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.653563 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86798d6c57-pqh5r"] Apr 22 18:46:58.697743 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.697718 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-58bf448f79-bgqh8"] Apr 22 18:46:58.700523 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.700504 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.716417 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.716396 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58bf448f79-bgqh8"] Apr 22 18:46:58.754892 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.754870 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6m5dg"] Apr 22 18:46:58.757821 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.757804 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.760603 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.760583 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:46:58.760836 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.760821 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:46:58.760950 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.760867 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gf99c\"" Apr 22 18:46:58.760950 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.760884 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:46:58.761063 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.761050 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:46:58.766863 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.766846 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6m5dg"] Apr 22 18:46:58.790689 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.788571 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3e0d1edc-3f84-4d4b-b197-983f0a58e4e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8vl7f\" (UID: \"3e0d1edc-3f84-4d4b-b197-983f0a58e4e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f" Apr 22 18:46:58.889886 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.889799 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-registry-tls\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.889886 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.889837 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.889886 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.889855 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-data-volume\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.889886 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.889876 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-bound-sa-token\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890186 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.889973 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/bf5c4059-62e9-440e-9901-5c1a13f99ab4-image-registry-private-configuration\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890186 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890022 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5c4059-62e9-440e-9901-5c1a13f99ab4-trusted-ca\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890186 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890047 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5c4059-62e9-440e-9901-5c1a13f99ab4-registry-certificates\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890186 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890105 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2lg7\" (UniqueName: \"kubernetes.io/projected/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-kube-api-access-k2lg7\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.890186 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890160 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6m5dg\" (UID: 
\"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.890376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890195 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-crio-socket\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.890376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890254 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3e0d1edc-3f84-4d4b-b197-983f0a58e4e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8vl7f\" (UID: \"3e0d1edc-3f84-4d4b-b197-983f0a58e4e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f" Apr 22 18:46:58.890376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890274 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d89p\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-kube-api-access-7d89p\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890297 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5c4059-62e9-440e-9901-5c1a13f99ab4-ca-trust-extracted\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890503 ip-10-0-136-85 kubenswrapper[2535]: E0422 
18:46:58.890375 2535 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:46:58.890503 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.890374 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5c4059-62e9-440e-9901-5c1a13f99ab4-installation-pull-secrets\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.890503 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:46:58.890437 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e0d1edc-3f84-4d4b-b197-983f0a58e4e9-tls-certificates podName:3e0d1edc-3f84-4d4b-b197-983f0a58e4e9 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.390419343 +0000 UTC m=+54.754219213 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/3e0d1edc-3f84-4d4b-b197-983f0a58e4e9-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-8vl7f" (UID: "3e0d1edc-3f84-4d4b-b197-983f0a58e4e9") : secret "prometheus-operator-admission-webhook-tls" not found Apr 22 18:46:58.990867 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.990840 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2lg7\" (UniqueName: \"kubernetes.io/projected/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-kube-api-access-k2lg7\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.991077 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.990882 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.991077 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991028 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-crio-socket\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.991201 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991102 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d89p\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-kube-api-access-7d89p\") pod \"image-registry-58bf448f79-bgqh8\" (UID: 
\"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991201 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991129 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5c4059-62e9-440e-9901-5c1a13f99ab4-ca-trust-extracted\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991201 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991167 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5c4059-62e9-440e-9901-5c1a13f99ab4-installation-pull-secrets\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991206 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-registry-tls\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991230 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.991349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991235 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-crio-socket\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.991349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991259 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-data-volume\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.991349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991313 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-bound-sa-token\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991578 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991356 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5c4059-62e9-440e-9901-5c1a13f99ab4-image-registry-private-configuration\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991578 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991421 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5c4059-62e9-440e-9901-5c1a13f99ab4-trusted-ca\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 
18:46:58.991578 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991464 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5c4059-62e9-440e-9901-5c1a13f99ab4-registry-certificates\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.991730 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.991623 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-data-volume\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.992131 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.992107 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5c4059-62e9-440e-9901-5c1a13f99ab4-ca-trust-extracted\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.992603 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.992322 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5c4059-62e9-440e-9901-5c1a13f99ab4-registry-certificates\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.992776 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.992748 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5c4059-62e9-440e-9901-5c1a13f99ab4-trusted-ca\") pod 
\"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.992927 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.992887 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.994192 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.994159 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5c4059-62e9-440e-9901-5c1a13f99ab4-image-registry-private-configuration\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.994276 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.994166 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg" Apr 22 18:46:58.994502 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.994485 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-registry-tls\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:46:58.995015 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:58.995000 2535 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5c4059-62e9-440e-9901-5c1a13f99ab4-installation-pull-secrets\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8"
Apr 22 18:46:59.003132 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.003104 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d89p\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-kube-api-access-7d89p\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8"
Apr 22 18:46:59.003364 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.003343 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5c4059-62e9-440e-9901-5c1a13f99ab4-bound-sa-token\") pod \"image-registry-58bf448f79-bgqh8\" (UID: \"bf5c4059-62e9-440e-9901-5c1a13f99ab4\") " pod="openshift-image-registry/image-registry-58bf448f79-bgqh8"
Apr 22 18:46:59.003711 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.003690 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2lg7\" (UniqueName: \"kubernetes.io/projected/e58478a2-3048-4cd7-9baa-4ab6d6f7f226-kube-api-access-k2lg7\") pod \"insights-runtime-extractor-6m5dg\" (UID: \"e58478a2-3048-4cd7-9baa-4ab6d6f7f226\") " pod="openshift-insights/insights-runtime-extractor-6m5dg"
Apr 22 18:46:59.008698 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.008681 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8"
Apr 22 18:46:59.066787 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.066219 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6m5dg"
Apr 22 18:46:59.147187 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.147028 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58bf448f79-bgqh8"]
Apr 22 18:46:59.150312 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:59.150276 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5c4059_62e9_440e_9901_5c1a13f99ab4.slice/crio-c1abb8fe321caae60d2c0124dadc72b3780f9c259ef9d90781701e3bed05e19a WatchSource:0}: Error finding container c1abb8fe321caae60d2c0124dadc72b3780f9c259ef9d90781701e3bed05e19a: Status 404 returned error can't find the container with id c1abb8fe321caae60d2c0124dadc72b3780f9c259ef9d90781701e3bed05e19a
Apr 22 18:46:59.198841 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.198816 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6m5dg"]
Apr 22 18:46:59.202394 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:59.202371 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58478a2_3048_4cd7_9baa_4ab6d6f7f226.slice/crio-a51795fe8ae8e1ea216e4481d19f24d50465bec7acc2da97c18e1d4c170d7dbf WatchSource:0}: Error finding container a51795fe8ae8e1ea216e4481d19f24d50465bec7acc2da97c18e1d4c170d7dbf: Status 404 returned error can't find the container with id a51795fe8ae8e1ea216e4481d19f24d50465bec7acc2da97c18e1d4c170d7dbf
Apr 22 18:46:59.396172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.396134 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3e0d1edc-3f84-4d4b-b197-983f0a58e4e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8vl7f\" (UID: \"3e0d1edc-3f84-4d4b-b197-983f0a58e4e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"
Apr 22 18:46:59.399235 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.399189 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/3e0d1edc-3f84-4d4b-b197-983f0a58e4e9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8vl7f\" (UID: \"3e0d1edc-3f84-4d4b-b197-983f0a58e4e9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"
Apr 22 18:46:59.462432 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.462398 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6m5dg" event={"ID":"e58478a2-3048-4cd7-9baa-4ab6d6f7f226","Type":"ContainerStarted","Data":"262fc888f729c9e20e4bea85d9131f4f2c8ac01823a91c4cd6d2f5b208819003"}
Apr 22 18:46:59.462578 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.462436 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6m5dg" event={"ID":"e58478a2-3048-4cd7-9baa-4ab6d6f7f226","Type":"ContainerStarted","Data":"a51795fe8ae8e1ea216e4481d19f24d50465bec7acc2da97c18e1d4c170d7dbf"}
Apr 22 18:46:59.463790 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.463764 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" event={"ID":"bf5c4059-62e9-440e-9901-5c1a13f99ab4","Type":"ContainerStarted","Data":"15607cb858c863618ebae04daa7b9b85f304ea79d35922aca29f6533a4064715"}
Apr 22 18:46:59.463897 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.463796 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" event={"ID":"bf5c4059-62e9-440e-9901-5c1a13f99ab4","Type":"ContainerStarted","Data":"c1abb8fe321caae60d2c0124dadc72b3780f9c259ef9d90781701e3bed05e19a"}
Apr 22 18:46:59.463897 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.463892 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8"
Apr 22 18:46:59.481870 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.481807 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" podStartSLOduration=1.481792721 podStartE2EDuration="1.481792721s" podCreationTimestamp="2026-04-22 18:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:59.481138444 +0000 UTC m=+54.844938335" watchObservedRunningTime="2026-04-22 18:46:59.481792721 +0000 UTC m=+54.845592610"
Apr 22 18:46:59.540451 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.540413 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"
Apr 22 18:46:59.654223 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:46:59.654164 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"]
Apr 22 18:46:59.656937 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:46:59.656894 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e0d1edc_3f84_4d4b_b197_983f0a58e4e9.slice/crio-e7a26e4187238c9281c33d2ed31f7bc205e9e14da9600a6a948ebab260cc11a6 WatchSource:0}: Error finding container e7a26e4187238c9281c33d2ed31f7bc205e9e14da9600a6a948ebab260cc11a6: Status 404 returned error can't find the container with id e7a26e4187238c9281c33d2ed31f7bc205e9e14da9600a6a948ebab260cc11a6
Apr 22 18:47:00.469684 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:00.469644 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6m5dg" event={"ID":"e58478a2-3048-4cd7-9baa-4ab6d6f7f226","Type":"ContainerStarted","Data":"255ed963a78a472932844a74b8a9288c89eb12f14acc16dd75fa54455563033b"}
Apr 22 18:47:00.470819 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:00.470795 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f" event={"ID":"3e0d1edc-3f84-4d4b-b197-983f0a58e4e9","Type":"ContainerStarted","Data":"e7a26e4187238c9281c33d2ed31f7bc205e9e14da9600a6a948ebab260cc11a6"}
Apr 22 18:47:01.475514 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:01.475471 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f" event={"ID":"3e0d1edc-3f84-4d4b-b197-983f0a58e4e9","Type":"ContainerStarted","Data":"ae708decf20efd7d21a57350ba7178a49f01b2a78c9217cc42f0e8f58459e16e"}
Apr 22 18:47:01.475974 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:01.475754 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"
Apr 22 18:47:01.481996 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:01.481970 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f"
Apr 22 18:47:01.491489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:01.491396 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8vl7f" podStartSLOduration=2.457996616 podStartE2EDuration="3.491376531s" podCreationTimestamp="2026-04-22 18:46:58 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.658736678 +0000 UTC m=+55.022536545" lastFinishedPulling="2026-04-22 18:47:00.69211659 +0000 UTC m=+56.055916460" observedRunningTime="2026-04-22 18:47:01.48993088 +0000 UTC m=+56.853730770" watchObservedRunningTime="2026-04-22 18:47:01.491376531 +0000 UTC m=+56.855176423"
Apr 22 18:47:02.479330 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.479278 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6m5dg" event={"ID":"e58478a2-3048-4cd7-9baa-4ab6d6f7f226","Type":"ContainerStarted","Data":"0ea1efb0a94289fb38df2095643275999be3dfd4471f39feccadbc694ec12aec"}
Apr 22 18:47:02.496778 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.496734 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6m5dg" podStartSLOduration=1.844891769 podStartE2EDuration="4.496720677s" podCreationTimestamp="2026-04-22 18:46:58 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.266767061 +0000 UTC m=+54.630566935" lastFinishedPulling="2026-04-22 18:47:01.918595975 +0000 UTC m=+57.282395843" observedRunningTime="2026-04-22 18:47:02.495660749 +0000 UTC m=+57.859460637" watchObservedRunningTime="2026-04-22 18:47:02.496720677 +0000 UTC m=+57.860520566"
Apr 22 18:47:02.587060 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.587026 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"]
Apr 22 18:47:02.611757 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.611729 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"]
Apr 22 18:47:02.611913 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.611841 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.614567 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.614543 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 18:47:02.614681 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.614564 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:47:02.614681 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.614613 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 18:47:02.614798 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.614723 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:47:02.614858 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.614841 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:47:02.614925 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.614841 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-vnwlt\""
Apr 22 18:47:02.620768 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.620739 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tl4f\" (UniqueName: \"kubernetes.io/projected/8db4bc65-ab32-491a-959b-d84aa9db30b1-kube-api-access-6tl4f\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.620768 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.620765 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.620941 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.620920 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8db4bc65-ab32-491a-959b-d84aa9db30b1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.621004 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.620971 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.721856 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.721824 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8db4bc65-ab32-491a-959b-d84aa9db30b1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.722053 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.721865 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.722053 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.721944 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tl4f\" (UniqueName: \"kubernetes.io/projected/8db4bc65-ab32-491a-959b-d84aa9db30b1-kube-api-access-6tl4f\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.722053 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.721970 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.722202 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:47:02.722099 2535 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 22 18:47:02.722202 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:47:02.722167 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-tls podName:8db4bc65-ab32-491a-959b-d84aa9db30b1 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.222147791 +0000 UTC m=+58.585947675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-t2jsf" (UID: "8db4bc65-ab32-491a-959b-d84aa9db30b1") : secret "prometheus-operator-tls" not found
Apr 22 18:47:02.722562 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.722542 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8db4bc65-ab32-491a-959b-d84aa9db30b1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.724388 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.724370 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:02.731267 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:02.731214 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tl4f\" (UniqueName: \"kubernetes.io/projected/8db4bc65-ab32-491a-959b-d84aa9db30b1-kube-api-access-6tl4f\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:03.226796 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:03.226758 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:03.229163 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:03.229143 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8db4bc65-ab32-491a-959b-d84aa9db30b1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t2jsf\" (UID: \"8db4bc65-ab32-491a-959b-d84aa9db30b1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:03.390205 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:03.390180 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b8wsf"
Apr 22 18:47:03.520945 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:03.520858 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"
Apr 22 18:47:03.636645 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:03.636618 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t2jsf"]
Apr 22 18:47:03.641575 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:03.641547 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db4bc65_ab32_491a_959b_d84aa9db30b1.slice/crio-33a35d381202b0f59736ea2c00ee040afd180d85963317d1bd2293391a49aadb WatchSource:0}: Error finding container 33a35d381202b0f59736ea2c00ee040afd180d85963317d1bd2293391a49aadb: Status 404 returned error can't find the container with id 33a35d381202b0f59736ea2c00ee040afd180d85963317d1bd2293391a49aadb
Apr 22 18:47:04.485739 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:04.485693 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf" event={"ID":"8db4bc65-ab32-491a-959b-d84aa9db30b1","Type":"ContainerStarted","Data":"33a35d381202b0f59736ea2c00ee040afd180d85963317d1bd2293391a49aadb"}
Apr 22 18:47:05.490275 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:05.490198 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf" event={"ID":"8db4bc65-ab32-491a-959b-d84aa9db30b1","Type":"ContainerStarted","Data":"30c5306c798f20e7d1dcd81e9cde41e48d30753e14083d2613d3941260da4869"}
Apr 22 18:47:05.490275 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:05.490237 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf" event={"ID":"8db4bc65-ab32-491a-959b-d84aa9db30b1","Type":"ContainerStarted","Data":"0f837ce609b50e6f490cb84988b41b39fe3e314befdb701d4dc0df2e08a5ad93"}
Apr 22 18:47:05.508165 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:05.508088 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-t2jsf" podStartSLOduration=1.943664743 podStartE2EDuration="3.508071876s" podCreationTimestamp="2026-04-22 18:47:02 +0000 UTC" firstStartedPulling="2026-04-22 18:47:03.64354328 +0000 UTC m=+59.007343146" lastFinishedPulling="2026-04-22 18:47:05.207950405 +0000 UTC m=+60.571750279" observedRunningTime="2026-04-22 18:47:05.507563156 +0000 UTC m=+60.871363046" watchObservedRunningTime="2026-04-22 18:47:05.508071876 +0000 UTC m=+60.871871765"
Apr 22 18:47:06.966882 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:06.966844 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qt4p8"]
Apr 22 18:47:06.970490 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:06.970468 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:06.972960 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:06.972838 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:47:06.972960 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:06.972940 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:47:06.973120 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:06.972944 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7zc6s\""
Apr 22 18:47:06.973120 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:06.973112 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:47:07.055481 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055449 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-sys\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055584 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055494 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-accelerators-collector-config\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055584 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055550 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brtgz\" (UniqueName: \"kubernetes.io/projected/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-kube-api-access-brtgz\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055661 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055643 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-wtmp\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055698 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055673 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-tls\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055776 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055758 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-root\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055821 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055805 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-metrics-client-ca\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055851 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055835 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.055925 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.055893 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-textfile\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.156847 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.156819 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-root\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.156981 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.156864 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-metrics-client-ca\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.156981 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.156890 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.156981 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.156943 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-textfile\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157136 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.156981 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-sys\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157136 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157013 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-accelerators-collector-config\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157136 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157048 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brtgz\" (UniqueName: \"kubernetes.io/projected/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-kube-api-access-brtgz\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157136 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157070 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-sys\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157136 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157088 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-wtmp\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157136 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157130 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-tls\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157409 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157213 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-wtmp\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157409 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:47:07.157243 2535 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:47:07.157409 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:47:07.157306 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-tls podName:1d364c8b-db8c-4766-a5ca-5c7ad02e0594 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:07.657287128 +0000 UTC m=+63.021086995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-tls") pod "node-exporter-qt4p8" (UID: "1d364c8b-db8c-4766-a5ca-5c7ad02e0594") : secret "node-exporter-tls" not found
Apr 22 18:47:07.157558 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.156943 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-root\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157610 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157562 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-textfile\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157663 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157620 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-metrics-client-ca\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.157663 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.157628 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-accelerators-collector-config\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.159666 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.159645 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.167147 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.167124 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brtgz\" (UniqueName: \"kubernetes.io/projected/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-kube-api-access-brtgz\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.460298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.460271 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5xfcn"
Apr 22 18:47:07.660879 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.660846 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-tls\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.663191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.663169 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d364c8b-db8c-4766-a5ca-5c7ad02e0594-node-exporter-tls\") pod \"node-exporter-qt4p8\" (UID: \"1d364c8b-db8c-4766-a5ca-5c7ad02e0594\") " pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.881796 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:07.881728 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qt4p8"
Apr 22 18:47:07.890735 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:07.890704 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d364c8b_db8c_4766_a5ca_5c7ad02e0594.slice/crio-6356bb4d63f7a437b87c2c78af97c22e7549bb2367b584160b0a9da2d10dabf3 WatchSource:0}: Error finding container 6356bb4d63f7a437b87c2c78af97c22e7549bb2367b584160b0a9da2d10dabf3: Status 404 returned error can't find the container with id 6356bb4d63f7a437b87c2c78af97c22e7549bb2367b584160b0a9da2d10dabf3
Apr 22 18:47:08.503738 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:08.503691 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qt4p8" event={"ID":"1d364c8b-db8c-4766-a5ca-5c7ad02e0594","Type":"ContainerStarted","Data":"6356bb4d63f7a437b87c2c78af97c22e7549bb2367b584160b0a9da2d10dabf3"}
Apr 22 18:47:09.507863 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:09.507829 2535 generic.go:358] "Generic (PLEG): container finished" podID="1d364c8b-db8c-4766-a5ca-5c7ad02e0594" containerID="ce914414c7907bb569325f647e3184ed870e27e97441c220f65a419c9eccd30a" exitCode=0
Apr 22 18:47:09.508227 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:09.507881 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qt4p8" event={"ID":"1d364c8b-db8c-4766-a5ca-5c7ad02e0594","Type":"ContainerDied","Data":"ce914414c7907bb569325f647e3184ed870e27e97441c220f65a419c9eccd30a"}
Apr 22 18:47:10.513044 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.513012 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qt4p8" event={"ID":"1d364c8b-db8c-4766-a5ca-5c7ad02e0594","Type":"ContainerStarted","Data":"d841a3913c9ca25c94ae09558936f41ad8c80f990389448b5b31454e6385a053"}
Apr 22 18:47:10.513044 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.513046 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qt4p8" event={"ID":"1d364c8b-db8c-4766-a5ca-5c7ad02e0594","Type":"ContainerStarted","Data":"2b68345b9cbfa3fee5ef663dc5b4f734763927271e04df9b862e105094689dfd"}
Apr 22 18:47:10.546259 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.546212 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qt4p8" podStartSLOduration=3.837510516 podStartE2EDuration="4.546199105s" podCreationTimestamp="2026-04-22 18:47:06 +0000 UTC" firstStartedPulling="2026-04-22 18:47:07.892674205 +0000 UTC m=+63.256474076" lastFinishedPulling="2026-04-22 18:47:08.601362784 +0000 UTC m=+63.965162665" observedRunningTime="2026-04-22 18:47:10.545030714 +0000 UTC m=+65.908830603" watchObservedRunningTime="2026-04-22 18:47:10.546199105 +0000 UTC m=+65.909998994"
Apr 22 18:47:10.887713 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.887640 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:47:10.890396 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.890379 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:10.900703 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.900683 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/252dfd14-9c83-4928-bbcd-d84b479525bc-metrics-certs\") pod \"network-metrics-daemon-5g7dk\" (UID: \"252dfd14-9c83-4928-bbcd-d84b479525bc\") " pod="openshift-multus/network-metrics-daemon-5g7dk"
Apr 22 18:47:10.988156 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.988123 2535
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:47:10.990647 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:10.990628 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:11.000917 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.000883 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:11.011580 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.011555 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v2np\" (UniqueName: \"kubernetes.io/projected/85ff2eb7-3fb1-424b-9402-d67103c35bf2-kube-api-access-9v2np\") pod \"network-check-target-89stm\" (UID: \"85ff2eb7-3fb1-424b-9402-d67103c35bf2\") " pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:47:11.174522 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.174493 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4wrrh\"" Apr 22 18:47:11.178263 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.178246 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-scwsl\"" Apr 22 18:47:11.182966 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.182950 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:47:11.186628 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.186614 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5g7dk" Apr 22 18:47:11.303298 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.303271 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-89stm"] Apr 22 18:47:11.317519 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:11.317487 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ff2eb7_3fb1_424b_9402_d67103c35bf2.slice/crio-64ae1170f6b908cdec2f599b7f93edc0f08ecb9b718c7f0c421184bb62ff8589 WatchSource:0}: Error finding container 64ae1170f6b908cdec2f599b7f93edc0f08ecb9b718c7f0c421184bb62ff8589: Status 404 returned error can't find the container with id 64ae1170f6b908cdec2f599b7f93edc0f08ecb9b718c7f0c421184bb62ff8589 Apr 22 18:47:11.328347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.328324 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5g7dk"] Apr 22 18:47:11.331168 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:11.331144 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252dfd14_9c83_4928_bbcd_d84b479525bc.slice/crio-2f3245b2ef45c8ad479cbb3eab65313969f8aaf25f24f9844fa6af8e30db1ef1 WatchSource:0}: Error finding container 2f3245b2ef45c8ad479cbb3eab65313969f8aaf25f24f9844fa6af8e30db1ef1: Status 404 returned error can't find the container with id 2f3245b2ef45c8ad479cbb3eab65313969f8aaf25f24f9844fa6af8e30db1ef1 Apr 22 18:47:11.517103 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.517021 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5g7dk" 
event={"ID":"252dfd14-9c83-4928-bbcd-d84b479525bc","Type":"ContainerStarted","Data":"2f3245b2ef45c8ad479cbb3eab65313969f8aaf25f24f9844fa6af8e30db1ef1"} Apr 22 18:47:11.518060 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:11.518038 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-89stm" event={"ID":"85ff2eb7-3fb1-424b-9402-d67103c35bf2","Type":"ContainerStarted","Data":"64ae1170f6b908cdec2f599b7f93edc0f08ecb9b718c7f0c421184bb62ff8589"} Apr 22 18:47:13.529940 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:13.528528 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5g7dk" event={"ID":"252dfd14-9c83-4928-bbcd-d84b479525bc","Type":"ContainerStarted","Data":"8259e85b48da03798814841b2f6a496c46c3ad1f06a55dd98d15f2c3345fb028"} Apr 22 18:47:13.529940 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:13.528571 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5g7dk" event={"ID":"252dfd14-9c83-4928-bbcd-d84b479525bc","Type":"ContainerStarted","Data":"557af9afcdfba5a1bbca13cb9e86f78575902a4a8ab33b5bddce887e02f51f8b"} Apr 22 18:47:14.196379 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.196325 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5g7dk" podStartSLOduration=68.069328727 podStartE2EDuration="1m9.196304287s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:47:11.332767469 +0000 UTC m=+66.696567336" lastFinishedPulling="2026-04-22 18:47:12.459743016 +0000 UTC m=+67.823542896" observedRunningTime="2026-04-22 18:47:13.5431663 +0000 UTC m=+68.906966190" watchObservedRunningTime="2026-04-22 18:47:14.196304287 +0000 UTC m=+69.560104176" Apr 22 18:47:14.197168 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.197139 2535 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-6bcc868b7-xnrl5"] Apr 22 18:47:14.201675 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.201653 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:14.204099 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.204071 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:47:14.204220 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.204151 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:47:14.204428 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.204412 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lnsnh\"" Apr 22 18:47:14.209440 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.209414 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xnrl5"] Apr 22 18:47:14.321215 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.321182 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb869\" (UniqueName: \"kubernetes.io/projected/04248ee6-0336-4fa6-bef3-9f8c9e646abb-kube-api-access-hb869\") pod \"downloads-6bcc868b7-xnrl5\" (UID: \"04248ee6-0336-4fa6-bef3-9f8c9e646abb\") " pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:14.422439 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.422406 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb869\" (UniqueName: \"kubernetes.io/projected/04248ee6-0336-4fa6-bef3-9f8c9e646abb-kube-api-access-hb869\") pod \"downloads-6bcc868b7-xnrl5\" (UID: \"04248ee6-0336-4fa6-bef3-9f8c9e646abb\") " pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:14.431713 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:47:14.431684 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb869\" (UniqueName: \"kubernetes.io/projected/04248ee6-0336-4fa6-bef3-9f8c9e646abb-kube-api-access-hb869\") pod \"downloads-6bcc868b7-xnrl5\" (UID: \"04248ee6-0336-4fa6-bef3-9f8c9e646abb\") " pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:14.512460 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.512438 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:14.637137 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:14.637115 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xnrl5"] Apr 22 18:47:14.639368 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:14.639330 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04248ee6_0336_4fa6_bef3_9f8c9e646abb.slice/crio-836d73148c818e9704e027974a227e0c9d4b6ed4b7add4d7dc4629f12c16c743 WatchSource:0}: Error finding container 836d73148c818e9704e027974a227e0c9d4b6ed4b7add4d7dc4629f12c16c743: Status 404 returned error can't find the container with id 836d73148c818e9704e027974a227e0c9d4b6ed4b7add4d7dc4629f12c16c743 Apr 22 18:47:15.537272 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:15.537230 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xnrl5" event={"ID":"04248ee6-0336-4fa6-bef3-9f8c9e646abb","Type":"ContainerStarted","Data":"836d73148c818e9704e027974a227e0c9d4b6ed4b7add4d7dc4629f12c16c743"} Apr 22 18:47:15.538773 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:15.538744 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-89stm" 
event={"ID":"85ff2eb7-3fb1-424b-9402-d67103c35bf2","Type":"ContainerStarted","Data":"883e49b9a6e3dff2bf1d4a527ec9e99ad3926ac98ed221d587dc2949004bbbe3"} Apr 22 18:47:15.539000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:15.538980 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:47:15.554453 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:15.554396 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-89stm" podStartSLOduration=67.397129655 podStartE2EDuration="1m10.55438016s" podCreationTimestamp="2026-04-22 18:46:05 +0000 UTC" firstStartedPulling="2026-04-22 18:47:11.319403973 +0000 UTC m=+66.683203840" lastFinishedPulling="2026-04-22 18:47:14.476654475 +0000 UTC m=+69.840454345" observedRunningTime="2026-04-22 18:47:15.553777741 +0000 UTC m=+70.917577629" watchObservedRunningTime="2026-04-22 18:47:15.55438016 +0000 UTC m=+70.918180071" Apr 22 18:47:18.660036 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:18.660005 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:47:20.476307 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:20.476271 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-58bf448f79-bgqh8" Apr 22 18:47:23.674461 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:23.674393 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" podUID="b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" containerName="registry" containerID="cri-o://8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9" gracePeriod=30 Apr 22 18:47:23.923962 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:23.923941 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:47:24.004597 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004534 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-certificates\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.004748 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004595 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-image-registry-private-configuration\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.004748 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004640 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-bound-sa-token\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.004748 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004665 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-installation-pull-secrets\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.004748 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004702 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dxqc\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-kube-api-access-8dxqc\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: 
\"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.004748 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004736 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-ca-trust-extracted\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.005036 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004760 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-trusted-ca\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.005036 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004795 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") pod \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\" (UID: \"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff\") " Apr 22 18:47:24.005036 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.004990 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:24.005499 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.005438 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:24.007878 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.007811 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:24.008191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.008167 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:24.008443 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.008413 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-kube-api-access-8dxqc" (OuterVolumeSpecName: "kube-api-access-8dxqc") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "kube-api-access-8dxqc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:24.008704 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.008678 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:24.008776 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.008756 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:24.016735 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.016709 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" (UID: "b20ac4ad-b759-4b80-9fc7-1085aeefc7ff"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:47:24.105357 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105330 2535 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-bound-sa-token\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.105357 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105354 2535 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-installation-pull-secrets\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.105531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105366 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dxqc\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-kube-api-access-8dxqc\") on node \"ip-10-0-136-85.ec2.internal\" 
DevicePath \"\"" Apr 22 18:47:24.105531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105375 2535 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-ca-trust-extracted\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.105531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105385 2535 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-trusted-ca\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.105531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105394 2535 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-tls\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.105531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105402 2535 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-registry-certificates\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.105531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.105412 2535 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff-image-registry-private-configuration\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.565376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.565246 2535 generic.go:358] "Generic (PLEG): container finished" podID="b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" containerID="8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9" exitCode=0 Apr 22 18:47:24.565376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.565297 2535 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" event={"ID":"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff","Type":"ContainerDied","Data":"8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9"} Apr 22 18:47:24.565376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.565324 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" Apr 22 18:47:24.565376 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.565340 2535 scope.go:117] "RemoveContainer" containerID="8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9" Apr 22 18:47:24.565711 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.565327 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86798d6c57-pqh5r" event={"ID":"b20ac4ad-b759-4b80-9fc7-1085aeefc7ff","Type":"ContainerDied","Data":"a10456364baba15b56c90a17197c7fed40fffc57026e645c30d90f9110785846"} Apr 22 18:47:24.588681 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.588654 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86798d6c57-pqh5r"] Apr 22 18:47:24.592695 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:24.592670 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-86798d6c57-pqh5r"] Apr 22 18:47:25.117063 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.117030 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55685f47cf-8kn8f"] Apr 22 18:47:25.117671 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.117335 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" containerName="registry" Apr 22 18:47:25.117671 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.117352 2535 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" containerName="registry" Apr 22 18:47:25.117671 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.117410 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" containerName="registry" Apr 22 18:47:25.121922 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.121883 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.124532 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.124506 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:47:25.124664 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.124506 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:47:25.125176 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.125157 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:47:25.125282 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.125203 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:47:25.125522 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.125502 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mk7sg\"" Apr 22 18:47:25.125645 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.125628 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:47:25.133502 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.133480 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55685f47cf-8kn8f"] Apr 22 18:47:25.212766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.212735 2535 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-service-ca\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.212955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.212778 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-oauth-config\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.212955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.212872 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5djx\" (UniqueName: \"kubernetes.io/projected/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-kube-api-access-v5djx\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.212955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.212928 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-serving-cert\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.213116 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.212968 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-oauth-serving-cert\") pod \"console-55685f47cf-8kn8f\" (UID: 
\"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.213116 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.213050 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-config\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.258723 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.258685 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20ac4ad-b759-4b80-9fc7-1085aeefc7ff" path="/var/lib/kubelet/pods/b20ac4ad-b759-4b80-9fc7-1085aeefc7ff/volumes" Apr 22 18:47:25.313434 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.313399 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-config\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.313610 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.313459 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-service-ca\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.313610 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.313493 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-oauth-config\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " 
pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.313993 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.313967 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5djx\" (UniqueName: \"kubernetes.io/projected/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-kube-api-access-v5djx\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.314155 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.314028 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-serving-cert\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.314155 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.314067 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-oauth-serving-cert\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.314155 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.314134 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-config\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.314155 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.314144 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-service-ca\") pod \"console-55685f47cf-8kn8f\" (UID: 
\"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.314745 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.314694 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-oauth-serving-cert\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.319401 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.319365 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-oauth-config\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.319498 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.319406 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-serving-cert\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.324966 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.324945 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5djx\" (UniqueName: \"kubernetes.io/projected/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-kube-api-access-v5djx\") pod \"console-55685f47cf-8kn8f\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:25.432600 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:25.432565 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:31.297672 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.297567 2535 scope.go:117] "RemoveContainer" containerID="8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9" Apr 22 18:47:31.297977 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:47:31.297951 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9\": container with ID starting with 8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9 not found: ID does not exist" containerID="8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9" Apr 22 18:47:31.298023 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.297985 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9"} err="failed to get container status \"8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9\": rpc error: code = NotFound desc = could not find container \"8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9\": container with ID starting with 8e9747b8b0bd22fbfdd6746813a3955f38d6b6f8782e86db676ff66efe0aecc9 not found: ID does not exist" Apr 22 18:47:31.421579 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.421552 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55685f47cf-8kn8f"] Apr 22 18:47:31.425236 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:31.425205 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe4c331d_3bdc_48ca_a00f_9bea37f6a4b2.slice/crio-d8d2031cfaa648fd9fb05e64986e85234673037fffbad320bde74ccb128c2bba WatchSource:0}: Error finding container d8d2031cfaa648fd9fb05e64986e85234673037fffbad320bde74ccb128c2bba: Status 404 
returned error can't find the container with id d8d2031cfaa648fd9fb05e64986e85234673037fffbad320bde74ccb128c2bba Apr 22 18:47:31.587657 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.587532 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55685f47cf-8kn8f" event={"ID":"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2","Type":"ContainerStarted","Data":"d8d2031cfaa648fd9fb05e64986e85234673037fffbad320bde74ccb128c2bba"} Apr 22 18:47:31.590012 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.589984 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xnrl5" event={"ID":"04248ee6-0336-4fa6-bef3-9f8c9e646abb","Type":"ContainerStarted","Data":"1bbe0853cde8d0d2766e8f783b904c81285085e9f715165db8b821eda1dddbac"} Apr 22 18:47:31.590234 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.590214 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:31.591873 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.591845 2535 patch_prober.go:28] interesting pod/downloads-6bcc868b7-xnrl5 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.14:8080/\": dial tcp 10.133.0.14:8080: connect: connection refused" start-of-body= Apr 22 18:47:31.592010 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.591894 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-xnrl5" podUID="04248ee6-0336-4fa6-bef3-9f8c9e646abb" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.14:8080/\": dial tcp 10.133.0.14:8080: connect: connection refused" Apr 22 18:47:31.606482 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:31.606423 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-xnrl5" podStartSLOduration=0.845111239 
podStartE2EDuration="17.606406979s" podCreationTimestamp="2026-04-22 18:47:14 +0000 UTC" firstStartedPulling="2026-04-22 18:47:14.641192321 +0000 UTC m=+70.004992189" lastFinishedPulling="2026-04-22 18:47:31.402488062 +0000 UTC m=+86.766287929" observedRunningTime="2026-04-22 18:47:31.605101006 +0000 UTC m=+86.968900895" watchObservedRunningTime="2026-04-22 18:47:31.606406979 +0000 UTC m=+86.970206871" Apr 22 18:47:32.608264 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:32.608232 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-xnrl5" Apr 22 18:47:33.690142 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.689067 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64b866b994-sjc6h"] Apr 22 18:47:33.714715 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.714680 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b866b994-sjc6h"] Apr 22 18:47:33.714884 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.714809 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.725634 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.725610 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:47:33.785328 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785278 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-serving-cert\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.785530 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785346 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-trusted-ca-bundle\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.785530 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785379 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-service-ca\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.785530 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785407 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-oauth-serving-cert\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" 
Apr 22 18:47:33.785530 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785480 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-oauth-config\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.785530 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785522 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsvhm\" (UniqueName: \"kubernetes.io/projected/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-kube-api-access-vsvhm\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.785718 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.785558 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-config\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.886843 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-serving-cert\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.886893 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-trusted-ca-bundle\") 
pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.886945 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-service-ca\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.886975 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-oauth-serving-cert\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.887041 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-oauth-config\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.887091 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsvhm\" (UniqueName: \"kubernetes.io/projected/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-kube-api-access-vsvhm\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.887348 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.887130 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-config\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.888124 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.888075 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-trusted-ca-bundle\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.888372 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.888296 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-oauth-serving-cert\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.888751 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.888650 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-service-ca\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.888751 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.888679 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-config\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.890147 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.890123 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-oauth-config\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.896411 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.896370 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsvhm\" (UniqueName: \"kubernetes.io/projected/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-kube-api-access-vsvhm\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:33.898620 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:33.898594 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-serving-cert\") pod \"console-64b866b994-sjc6h\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:34.029369 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:34.028891 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:35.001851 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:35.001818 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b866b994-sjc6h"] Apr 22 18:47:35.005797 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:47:35.005755 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4e95027_b4b1_4b88_9708_8bfb85e7b3b8.slice/crio-896d1fbbc122e9e946eded414c082e460776ff130659628a74ac88eb7f877ce5 WatchSource:0}: Error finding container 896d1fbbc122e9e946eded414c082e460776ff130659628a74ac88eb7f877ce5: Status 404 returned error can't find the container with id 896d1fbbc122e9e946eded414c082e460776ff130659628a74ac88eb7f877ce5 Apr 22 18:47:35.608573 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:35.608532 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55685f47cf-8kn8f" event={"ID":"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2","Type":"ContainerStarted","Data":"c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea"} Apr 22 18:47:35.610120 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:35.610085 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b866b994-sjc6h" event={"ID":"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8","Type":"ContainerStarted","Data":"896d1fbbc122e9e946eded414c082e460776ff130659628a74ac88eb7f877ce5"} Apr 22 18:47:35.627865 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:35.627816 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55685f47cf-8kn8f" podStartSLOduration=6.831610327 podStartE2EDuration="10.627798005s" podCreationTimestamp="2026-04-22 18:47:25 +0000 UTC" firstStartedPulling="2026-04-22 18:47:31.42667947 +0000 UTC m=+86.790479337" lastFinishedPulling="2026-04-22 18:47:35.222867137 +0000 UTC m=+90.586667015" 
observedRunningTime="2026-04-22 18:47:35.625862626 +0000 UTC m=+90.989662516" watchObservedRunningTime="2026-04-22 18:47:35.627798005 +0000 UTC m=+90.991597897" Apr 22 18:47:36.615391 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:36.615348 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b866b994-sjc6h" event={"ID":"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8","Type":"ContainerStarted","Data":"6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651"} Apr 22 18:47:36.633861 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:36.633808 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64b866b994-sjc6h" podStartSLOduration=3.060695657 podStartE2EDuration="3.633760031s" podCreationTimestamp="2026-04-22 18:47:33 +0000 UTC" firstStartedPulling="2026-04-22 18:47:35.008377416 +0000 UTC m=+90.372177283" lastFinishedPulling="2026-04-22 18:47:35.581441777 +0000 UTC m=+90.945241657" observedRunningTime="2026-04-22 18:47:36.632187287 +0000 UTC m=+91.995987176" watchObservedRunningTime="2026-04-22 18:47:36.633760031 +0000 UTC m=+91.997559922" Apr 22 18:47:44.029474 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:44.029436 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:44.029474 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:44.029478 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:44.034034 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:44.034013 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:44.642746 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:44.642719 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:47:44.686350 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:47:44.686310 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55685f47cf-8kn8f"] Apr 22 18:47:45.433186 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:45.433151 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:47:46.544065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:47:46.544033 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-89stm" Apr 22 18:48:09.705485 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:09.705411 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55685f47cf-8kn8f" podUID="fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" containerName="console" containerID="cri-o://c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea" gracePeriod=15 Apr 22 18:48:09.991796 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:09.991772 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55685f47cf-8kn8f_fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2/console/0.log" Apr 22 18:48:09.991955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:09.991850 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:48:10.037489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.037456 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-serving-cert\") pod \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " Apr 22 18:48:10.037489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.037504 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-config\") pod \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " Apr 22 18:48:10.037737 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.037539 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-oauth-config\") pod \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " Apr 22 18:48:10.037919 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.037868 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-config" (OuterVolumeSpecName: "console-config") pod "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" (UID: "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:10.039870 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.039838 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" (UID: "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:10.040009 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.039894 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" (UID: "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:10.138043 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138004 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5djx\" (UniqueName: \"kubernetes.io/projected/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-kube-api-access-v5djx\") pod \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " Apr 22 18:48:10.138191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138055 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-oauth-serving-cert\") pod \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " Apr 22 18:48:10.138191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138084 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-service-ca\") pod \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\" (UID: \"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2\") " Apr 22 18:48:10.138258 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138247 2535 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:48:10.138307 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138263 2535 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:48:10.138307 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138278 2535 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-console-oauth-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:48:10.138407 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138330 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" (UID: "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:10.138407 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.138389 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-service-ca" (OuterVolumeSpecName: "service-ca") pod "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" (UID: "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:10.140330 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.140304 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-kube-api-access-v5djx" (OuterVolumeSpecName: "kube-api-access-v5djx") pod "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" (UID: "fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2"). InnerVolumeSpecName "kube-api-access-v5djx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:48:10.239260 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.239128 2535 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-service-ca\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:48:10.239260 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.239164 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5djx\" (UniqueName: \"kubernetes.io/projected/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-kube-api-access-v5djx\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:48:10.239260 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.239181 2535 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2-oauth-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:48:10.706583 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.706560 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55685f47cf-8kn8f_fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2/console/0.log" Apr 22 18:48:10.707065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.706601 2535 generic.go:358] "Generic (PLEG): container finished" podID="fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" containerID="c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea" exitCode=2 Apr 
22 18:48:10.707065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.706666 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55685f47cf-8kn8f" Apr 22 18:48:10.707065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.706677 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55685f47cf-8kn8f" event={"ID":"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2","Type":"ContainerDied","Data":"c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea"} Apr 22 18:48:10.707065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.706706 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55685f47cf-8kn8f" event={"ID":"fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2","Type":"ContainerDied","Data":"d8d2031cfaa648fd9fb05e64986e85234673037fffbad320bde74ccb128c2bba"} Apr 22 18:48:10.707065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.706723 2535 scope.go:117] "RemoveContainer" containerID="c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea" Apr 22 18:48:10.717582 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.717563 2535 scope.go:117] "RemoveContainer" containerID="c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea" Apr 22 18:48:10.717893 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:48:10.717868 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea\": container with ID starting with c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea not found: ID does not exist" containerID="c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea" Apr 22 18:48:10.717893 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.717921 2535 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea"} err="failed to get container status \"c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea\": rpc error: code = NotFound desc = could not find container \"c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea\": container with ID starting with c9f007a089ba9ccd913e6968709689f2dff2d6dbb94ce412b8633231737778ea not found: ID does not exist" Apr 22 18:48:10.733412 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.733374 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55685f47cf-8kn8f"] Apr 22 18:48:10.736858 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:10.736834 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55685f47cf-8kn8f"] Apr 22 18:48:11.260436 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:11.260063 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" path="/var/lib/kubelet/pods/fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2/volumes" Apr 22 18:48:35.081711 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.081673 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65d96b8b74-zp5wc"] Apr 22 18:48:35.082172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.081925 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" containerName="console" Apr 22 18:48:35.082172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.081936 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" containerName="console" Apr 22 18:48:35.082172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.081984 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe4c331d-3bdc-48ca-a00f-9bea37f6a4b2" containerName="console" Apr 22 18:48:35.084811 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:48:35.084796 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.097613 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.097585 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d96b8b74-zp5wc"] Apr 22 18:48:35.114343 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114317 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-oauth-serving-cert\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.114466 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114351 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-trusted-ca-bundle\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.114466 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114371 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-serving-cert\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.114466 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114386 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-service-ca\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " 
pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.114580 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114482 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-config\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.114580 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114515 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzfmq\" (UniqueName: \"kubernetes.io/projected/5ab68f3c-05d3-474e-bcc9-248cf667594b-kube-api-access-wzfmq\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.114580 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.114551 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-oauth-config\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.214855 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.214825 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-oauth-config\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.215010 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.214887 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-oauth-serving-cert\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.215010 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.214928 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-trusted-ca-bundle\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.215010 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.214958 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-serving-cert\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.215171 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.215026 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-service-ca\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.215171 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.215105 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-config\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.215171 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.215144 2535 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wzfmq\" (UniqueName: \"kubernetes.io/projected/5ab68f3c-05d3-474e-bcc9-248cf667594b-kube-api-access-wzfmq\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.216080 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.216029 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-oauth-serving-cert\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.216080 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.216071 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-config\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.216228 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.216109 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-trusted-ca-bundle\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.216365 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.216342 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-service-ca\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.217668 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.217645 2535 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-serving-cert\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.218116 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.218097 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-oauth-config\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.223111 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.223088 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzfmq\" (UniqueName: \"kubernetes.io/projected/5ab68f3c-05d3-474e-bcc9-248cf667594b-kube-api-access-wzfmq\") pod \"console-65d96b8b74-zp5wc\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") " pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.394097 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.394022 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:35.529674 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.529641 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d96b8b74-zp5wc"] Apr 22 18:48:35.535054 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:48:35.535019 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab68f3c_05d3_474e_bcc9_248cf667594b.slice/crio-b9cf9e4ded2a9db5e2cff2c144c40f01cae78738474289d0d8f2057ad0ce7137 WatchSource:0}: Error finding container b9cf9e4ded2a9db5e2cff2c144c40f01cae78738474289d0d8f2057ad0ce7137: Status 404 returned error can't find the container with id b9cf9e4ded2a9db5e2cff2c144c40f01cae78738474289d0d8f2057ad0ce7137 Apr 22 18:48:35.777734 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.777695 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d96b8b74-zp5wc" event={"ID":"5ab68f3c-05d3-474e-bcc9-248cf667594b","Type":"ContainerStarted","Data":"50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa"} Apr 22 18:48:35.777734 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.777737 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d96b8b74-zp5wc" event={"ID":"5ab68f3c-05d3-474e-bcc9-248cf667594b","Type":"ContainerStarted","Data":"b9cf9e4ded2a9db5e2cff2c144c40f01cae78738474289d0d8f2057ad0ce7137"} Apr 22 18:48:35.796194 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:35.796149 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65d96b8b74-zp5wc" podStartSLOduration=0.796133221 podStartE2EDuration="796.133221ms" podCreationTimestamp="2026-04-22 18:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:35.794627449 +0000 UTC m=+151.158427338" 
watchObservedRunningTime="2026-04-22 18:48:35.796133221 +0000 UTC m=+151.159933109" Apr 22 18:48:45.394865 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:45.394823 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:45.395257 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:45.394975 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:45.399666 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:45.399649 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:45.808034 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:45.808006 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65d96b8b74-zp5wc" Apr 22 18:48:45.852169 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:48:45.852138 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b866b994-sjc6h"] Apr 22 18:49:10.874968 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:10.874912 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64b866b994-sjc6h" podUID="f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" containerName="console" containerID="cri-o://6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651" gracePeriod=15 Apr 22 18:49:11.109046 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.109019 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b866b994-sjc6h_f4e95027-b4b1-4b88-9708-8bfb85e7b3b8/console/0.log" Apr 22 18:49:11.109153 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.109080 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:49:11.267846 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.267807 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-serving-cert\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268050 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.267872 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-service-ca\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268050 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.267896 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-trusted-ca-bundle\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268050 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.267936 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsvhm\" (UniqueName: \"kubernetes.io/projected/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-kube-api-access-vsvhm\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.268066 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-config\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268212 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:49:11.268103 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-oauth-config\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268212 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.268127 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-oauth-serving-cert\") pod \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\" (UID: \"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8\") " Apr 22 18:49:11.268445 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.268419 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-service-ca" (OuterVolumeSpecName: "service-ca") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:11.268524 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.268412 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:11.268524 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.268487 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-config" (OuterVolumeSpecName: "console-config") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:11.268802 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.268619 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:11.270189 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.270167 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:11.270286 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.270220 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-kube-api-access-vsvhm" (OuterVolumeSpecName: "kube-api-access-vsvhm") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "kube-api-access-vsvhm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:11.270328 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.270301 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" (UID: "f4e95027-b4b1-4b88-9708-8bfb85e7b3b8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:11.369074 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369038 2535 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-trusted-ca-bundle\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.369074 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369074 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vsvhm\" (UniqueName: \"kubernetes.io/projected/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-kube-api-access-vsvhm\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.369226 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369089 2535 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.369226 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369101 2535 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-oauth-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.369226 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369115 2535 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-oauth-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.369226 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369129 2535 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-console-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.369226 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.369140 2535 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8-service-ca\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.869372 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.869346 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b866b994-sjc6h_f4e95027-b4b1-4b88-9708-8bfb85e7b3b8/console/0.log" Apr 22 18:49:11.869545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.869384 2535 generic.go:358] "Generic (PLEG): container finished" podID="f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" containerID="6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651" exitCode=2 Apr 22 18:49:11.869545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.869420 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b866b994-sjc6h" event={"ID":"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8","Type":"ContainerDied","Data":"6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651"} Apr 22 18:49:11.869545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.869440 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b866b994-sjc6h" event={"ID":"f4e95027-b4b1-4b88-9708-8bfb85e7b3b8","Type":"ContainerDied","Data":"896d1fbbc122e9e946eded414c082e460776ff130659628a74ac88eb7f877ce5"} Apr 22 18:49:11.869545 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:49:11.869449 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b866b994-sjc6h" Apr 22 18:49:11.869545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.869453 2535 scope.go:117] "RemoveContainer" containerID="6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651" Apr 22 18:49:11.877580 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.877428 2535 scope.go:117] "RemoveContainer" containerID="6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651" Apr 22 18:49:11.877777 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:49:11.877675 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651\": container with ID starting with 6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651 not found: ID does not exist" containerID="6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651" Apr 22 18:49:11.877777 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.877700 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651"} err="failed to get container status \"6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651\": rpc error: code = NotFound desc = could not find container \"6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651\": container with ID starting with 6d4aea68fe2658a76fbd45523ccbbe302551d27275ae027aa1775bae5accc651 not found: ID does not exist" Apr 22 18:49:11.888209 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.888190 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b866b994-sjc6h"] Apr 22 18:49:11.891897 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:11.891865 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-64b866b994-sjc6h"] Apr 22 18:49:13.258356 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:49:13.258324 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" path="/var/lib/kubelet/pods/f4e95027-b4b1-4b88-9708-8bfb85e7b3b8/volumes" Apr 22 18:50:14.409120 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.409076 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c646db979-nf47d"] Apr 22 18:50:14.409568 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.409431 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" containerName="console" Apr 22 18:50:14.409568 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.409446 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" containerName="console" Apr 22 18:50:14.409568 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.409500 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4e95027-b4b1-4b88-9708-8bfb85e7b3b8" containerName="console" Apr 22 18:50:14.413557 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.413531 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c646db979-nf47d" Apr 22 18:50:14.416588 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.416553 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c646db979-nf47d"] Apr 22 18:50:14.486396 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486369 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-oauth-config\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d" Apr 22 18:50:14.486396 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486397 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx65x\" (UniqueName: \"kubernetes.io/projected/28288801-d78f-4926-a7df-f28e3728ad53-kube-api-access-tx65x\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d" Apr 22 18:50:14.486559 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486416 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-console-config\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d" Apr 22 18:50:14.486559 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486469 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-service-ca\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d" Apr 22 
18:50:14.486559 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486499 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-trusted-ca-bundle\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.486559 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486541 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-oauth-serving-cert\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.486683 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.486573 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-serving-cert\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.587788 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.587762 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-oauth-config\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.587943 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.587793 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx65x\" (UniqueName: \"kubernetes.io/projected/28288801-d78f-4926-a7df-f28e3728ad53-kube-api-access-tx65x\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.587943 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.587810 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-console-config\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.587943 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.587840 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-service-ca\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588073 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.587974 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-trusted-ca-bundle\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588073 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.588018 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-oauth-serving-cert\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588073 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.588062 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-serving-cert\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588626 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.588603 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-service-ca\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588728 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.588629 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-console-config\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588775 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.588750 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-trusted-ca-bundle\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.588878 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.588854 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-oauth-serving-cert\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.590302 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.590284 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-oauth-config\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.590590 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.590571 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-serving-cert\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.596046 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.596029 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx65x\" (UniqueName: \"kubernetes.io/projected/28288801-d78f-4926-a7df-f28e3728ad53-kube-api-access-tx65x\") pod \"console-7c646db979-nf47d\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.723867 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.723826 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:14.842085 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:14.842063 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c646db979-nf47d"]
Apr 22 18:50:14.844679 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:50:14.844651 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28288801_d78f_4926_a7df_f28e3728ad53.slice/crio-c0c42f00308aaab3a7ec65539da40bd97f7daebd72c7b99edac703f00f657892 WatchSource:0}: Error finding container c0c42f00308aaab3a7ec65539da40bd97f7daebd72c7b99edac703f00f657892: Status 404 returned error can't find the container with id c0c42f00308aaab3a7ec65539da40bd97f7daebd72c7b99edac703f00f657892
Apr 22 18:50:15.030043 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:15.029955 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c646db979-nf47d" event={"ID":"28288801-d78f-4926-a7df-f28e3728ad53","Type":"ContainerStarted","Data":"bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a"}
Apr 22 18:50:15.030043 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:15.029996 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c646db979-nf47d" event={"ID":"28288801-d78f-4926-a7df-f28e3728ad53","Type":"ContainerStarted","Data":"c0c42f00308aaab3a7ec65539da40bd97f7daebd72c7b99edac703f00f657892"}
Apr 22 18:50:15.048686 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:15.048636 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c646db979-nf47d" podStartSLOduration=1.048619392 podStartE2EDuration="1.048619392s" podCreationTimestamp="2026-04-22 18:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:15.048214223 +0000 UTC m=+250.412014127" watchObservedRunningTime="2026-04-22 18:50:15.048619392 +0000 UTC m=+250.412419282"
Apr 22 18:50:24.724882 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:24.724797 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:24.724882 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:24.724836 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:24.729514 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:24.729488 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:25.064857 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:25.064780 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c646db979-nf47d"
Apr 22 18:50:25.119965 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:25.119933 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d96b8b74-zp5wc"]
Apr 22 18:50:41.760674 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.760634 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"]
Apr 22 18:50:41.763940 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.763920 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.766337 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.766315 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:50:41.766421 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.766316 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:50:41.767151 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.767134 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-4pt94\""
Apr 22 18:50:41.773473 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.773449 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"]
Apr 22 18:50:41.870040 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.869999 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.870191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.870057 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.870191 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.870083 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7l7d\" (UniqueName: \"kubernetes.io/projected/a3c00bff-0d4c-497c-a714-ef0e9b16d862-kube-api-access-w7l7d\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.971306 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.971261 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.971471 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.971318 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7l7d\" (UniqueName: \"kubernetes.io/projected/a3c00bff-0d4c-497c-a714-ef0e9b16d862-kube-api-access-w7l7d\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.971471 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.971387 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.971646 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.971626 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.971734 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.971717 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:41.980742 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:41.980718 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7l7d\" (UniqueName: \"kubernetes.io/projected/a3c00bff-0d4c-497c-a714-ef0e9b16d862-kube-api-access-w7l7d\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:42.077274 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:42.077241 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:42.194574 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:42.194544 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"]
Apr 22 18:50:42.197812 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:50:42.197783 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c00bff_0d4c_497c_a714_ef0e9b16d862.slice/crio-0f45016f9d694f5a99211461848bc52188634bb36ef17710f816216ac0f901f3 WatchSource:0}: Error finding container 0f45016f9d694f5a99211461848bc52188634bb36ef17710f816216ac0f901f3: Status 404 returned error can't find the container with id 0f45016f9d694f5a99211461848bc52188634bb36ef17710f816216ac0f901f3
Apr 22 18:50:43.112776 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:43.112735 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerStarted","Data":"0f45016f9d694f5a99211461848bc52188634bb36ef17710f816216ac0f901f3"}
Apr 22 18:50:47.126576 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:47.126544 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerStarted","Data":"c9f44912d99de1a6c9e0d88f28fe2555e2d14f39c2f84507d8b8e6cb100fa8ea"}
Apr 22 18:50:48.130005 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:48.129971 2535 generic.go:358] "Generic (PLEG): container finished" podID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerID="c9f44912d99de1a6c9e0d88f28fe2555e2d14f39c2f84507d8b8e6cb100fa8ea" exitCode=0
Apr 22 18:50:48.130380 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:48.130064 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerDied","Data":"c9f44912d99de1a6c9e0d88f28fe2555e2d14f39c2f84507d8b8e6cb100fa8ea"}
Apr 22 18:50:50.139482 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.139447 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65d96b8b74-zp5wc" podUID="5ab68f3c-05d3-474e-bcc9-248cf667594b" containerName="console" containerID="cri-o://50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa" gracePeriod=15
Apr 22 18:50:50.140261 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.140233 2535 generic.go:358] "Generic (PLEG): container finished" podID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerID="69dca306e2ac902e62118f6aabb700ae46599b2ac814e35eb700229af3fd0b4b" exitCode=0
Apr 22 18:50:50.140383 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.140300 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerDied","Data":"69dca306e2ac902e62118f6aabb700ae46599b2ac814e35eb700229af3fd0b4b"}
Apr 22 18:50:50.371979 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.371959 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d96b8b74-zp5wc_5ab68f3c-05d3-474e-bcc9-248cf667594b/console/0.log"
Apr 22 18:50:50.372077 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.372018 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d96b8b74-zp5wc"
Apr 22 18:50:50.436995 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.436972 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-oauth-serving-cert\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437121 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437026 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-config\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437121 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437051 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzfmq\" (UniqueName: \"kubernetes.io/projected/5ab68f3c-05d3-474e-bcc9-248cf667594b-kube-api-access-wzfmq\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437121 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437079 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-serving-cert\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437121 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437094 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-service-ca\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437306 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437132 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-trusted-ca-bundle\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437306 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437151 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-oauth-config\") pod \"5ab68f3c-05d3-474e-bcc9-248cf667594b\" (UID: \"5ab68f3c-05d3-474e-bcc9-248cf667594b\") "
Apr 22 18:50:50.437403 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437371 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:50.437639 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437612 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-config" (OuterVolumeSpecName: "console-config") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:50.437639 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437632 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:50.437766 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.437618 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-service-ca" (OuterVolumeSpecName: "service-ca") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:50:50.439255 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.439232 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab68f3c-05d3-474e-bcc9-248cf667594b-kube-api-access-wzfmq" (OuterVolumeSpecName: "kube-api-access-wzfmq") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "kube-api-access-wzfmq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:50.439343 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.439297 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:50.439343 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.439309 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5ab68f3c-05d3-474e-bcc9-248cf667594b" (UID: "5ab68f3c-05d3-474e-bcc9-248cf667594b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:50:50.538166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538115 2535 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:50.538166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538134 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzfmq\" (UniqueName: \"kubernetes.io/projected/5ab68f3c-05d3-474e-bcc9-248cf667594b-kube-api-access-wzfmq\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:50.538166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538145 2535 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:50.538166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538154 2535 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-service-ca\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:50.538166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538164 2535 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-trusted-ca-bundle\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:50.538368 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538172 2535 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5ab68f3c-05d3-474e-bcc9-248cf667594b-console-oauth-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:50.538368 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:50.538182 2535 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5ab68f3c-05d3-474e-bcc9-248cf667594b-oauth-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:51.149968 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.149896 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d96b8b74-zp5wc_5ab68f3c-05d3-474e-bcc9-248cf667594b/console/0.log"
Apr 22 18:50:51.150404 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.150012 2535 generic.go:358] "Generic (PLEG): container finished" podID="5ab68f3c-05d3-474e-bcc9-248cf667594b" containerID="50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa" exitCode=2
Apr 22 18:50:51.150404 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.150066 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d96b8b74-zp5wc" event={"ID":"5ab68f3c-05d3-474e-bcc9-248cf667594b","Type":"ContainerDied","Data":"50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa"}
Apr 22 18:50:51.150404 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.150156 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d96b8b74-zp5wc" event={"ID":"5ab68f3c-05d3-474e-bcc9-248cf667594b","Type":"ContainerDied","Data":"b9cf9e4ded2a9db5e2cff2c144c40f01cae78738474289d0d8f2057ad0ce7137"}
Apr 22 18:50:51.150404 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.150190 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d96b8b74-zp5wc"
Apr 22 18:50:51.150404 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.150191 2535 scope.go:117] "RemoveContainer" containerID="50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa"
Apr 22 18:50:51.158921 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.158878 2535 scope.go:117] "RemoveContainer" containerID="50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa"
Apr 22 18:50:51.159202 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:50:51.159177 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa\": container with ID starting with 50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa not found: ID does not exist" containerID="50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa"
Apr 22 18:50:51.159308 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.159212 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa"} err="failed to get container status \"50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa\": rpc error: code = NotFound desc = could not find container \"50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa\": container with ID starting with 50d41017504888db28c9829d8a5c2eab1aec163f755ec73c804bacaea9678bfa not found: ID does not exist"
Apr 22 18:50:51.173384 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.173358 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d96b8b74-zp5wc"]
Apr 22 18:50:51.176684 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.176660 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65d96b8b74-zp5wc"]
Apr 22 18:50:51.258998 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:51.258975 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab68f3c-05d3-474e-bcc9-248cf667594b" path="/var/lib/kubelet/pods/5ab68f3c-05d3-474e-bcc9-248cf667594b/volumes"
Apr 22 18:50:56.166826 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:56.166796 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerStarted","Data":"d129c755c0348719991273c1e603a8b9ea773a03b5641b00918432c8e5b09eb9"}
Apr 22 18:50:56.184074 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:56.184031 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" podStartSLOduration=1.327279648 podStartE2EDuration="15.184016767s" podCreationTimestamp="2026-04-22 18:50:41 +0000 UTC" firstStartedPulling="2026-04-22 18:50:42.199893516 +0000 UTC m=+277.563693383" lastFinishedPulling="2026-04-22 18:50:56.056630632 +0000 UTC m=+291.420430502" observedRunningTime="2026-04-22 18:50:56.182018577 +0000 UTC m=+291.545818466" watchObservedRunningTime="2026-04-22 18:50:56.184016767 +0000 UTC m=+291.547816655"
Apr 22 18:50:57.172055 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:57.172025 2535 generic.go:358] "Generic (PLEG): container finished" podID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerID="d129c755c0348719991273c1e603a8b9ea773a03b5641b00918432c8e5b09eb9" exitCode=0
Apr 22 18:50:57.172395 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:57.172067 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerDied","Data":"d129c755c0348719991273c1e603a8b9ea773a03b5641b00918432c8e5b09eb9"}
Apr 22 18:50:58.296415 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.296395 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:50:58.404643 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.404617 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-bundle\") pod \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") "
Apr 22 18:50:58.404777 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.404661 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l7d\" (UniqueName: \"kubernetes.io/projected/a3c00bff-0d4c-497c-a714-ef0e9b16d862-kube-api-access-w7l7d\") pod \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") "
Apr 22 18:50:58.404777 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.404691 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-util\") pod \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\" (UID: \"a3c00bff-0d4c-497c-a714-ef0e9b16d862\") "
Apr 22 18:50:58.405231 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.405200 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-bundle" (OuterVolumeSpecName: "bundle") pod "a3c00bff-0d4c-497c-a714-ef0e9b16d862" (UID: "a3c00bff-0d4c-497c-a714-ef0e9b16d862"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:58.407007 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.406977 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c00bff-0d4c-497c-a714-ef0e9b16d862-kube-api-access-w7l7d" (OuterVolumeSpecName: "kube-api-access-w7l7d") pod "a3c00bff-0d4c-497c-a714-ef0e9b16d862" (UID: "a3c00bff-0d4c-497c-a714-ef0e9b16d862"). InnerVolumeSpecName "kube-api-access-w7l7d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:58.408983 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.408963 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-util" (OuterVolumeSpecName: "util") pod "a3c00bff-0d4c-497c-a714-ef0e9b16d862" (UID: "a3c00bff-0d4c-497c-a714-ef0e9b16d862"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:58.506107 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.506040 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w7l7d\" (UniqueName: \"kubernetes.io/projected/a3c00bff-0d4c-497c-a714-ef0e9b16d862-kube-api-access-w7l7d\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:58.506107 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.506073 2535 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-util\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:58.506107 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:58.506083 2535 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3c00bff-0d4c-497c-a714-ef0e9b16d862-bundle\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:50:59.180061 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:59.180026 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5" event={"ID":"a3c00bff-0d4c-497c-a714-ef0e9b16d862","Type":"ContainerDied","Data":"0f45016f9d694f5a99211461848bc52188634bb36ef17710f816216ac0f901f3"}
Apr 22 18:50:59.180061 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:59.180064 2535 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f45016f9d694f5a99211461848bc52188634bb36ef17710f816216ac0f901f3"
Apr 22 18:50:59.180267 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:50:59.180042 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwkdg5"
Apr 22 18:51:03.328417 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328381 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc"]
Apr 22 18:51:03.328792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328718 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="util"
Apr 22 18:51:03.328792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328735 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="util"
Apr 22 18:51:03.328792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328758 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ab68f3c-05d3-474e-bcc9-248cf667594b" containerName="console"
Apr 22 18:51:03.328792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328767 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab68f3c-05d3-474e-bcc9-248cf667594b" containerName="console"
Apr 22 18:51:03.328792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328780 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="pull"
Apr 22 18:51:03.328792 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328789 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="pull"
Apr 22 18:51:03.328993 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328799 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="extract"
Apr 22 18:51:03.328993 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328808 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="extract"
Apr 22 18:51:03.328993 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328861 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ab68f3c-05d3-474e-bcc9-248cf667594b" containerName="console"
Apr 22 18:51:03.328993 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.328875 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3c00bff-0d4c-497c-a714-ef0e9b16d862" containerName="extract"
Apr 22 18:51:03.332075 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.332060 2535 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.334684 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.334655 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:51:03.334822 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.334788 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-t4xlm\"" Apr 22 18:51:03.334955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.334928 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:51:03.334955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.334930 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:51:03.342472 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.342435 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc"] Apr 22 18:51:03.442610 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.442578 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c86e27d0-7d7b-4a3a-8d63-a660673b7e79-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dmszc\" (UID: \"c86e27d0-7d7b-4a3a-8d63-a660673b7e79\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.442782 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.442624 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9k6g\" (UniqueName: \"kubernetes.io/projected/c86e27d0-7d7b-4a3a-8d63-a660673b7e79-kube-api-access-v9k6g\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dmszc\" (UID: 
\"c86e27d0-7d7b-4a3a-8d63-a660673b7e79\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.543532 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.543498 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c86e27d0-7d7b-4a3a-8d63-a660673b7e79-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dmszc\" (UID: \"c86e27d0-7d7b-4a3a-8d63-a660673b7e79\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.543687 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.543541 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9k6g\" (UniqueName: \"kubernetes.io/projected/c86e27d0-7d7b-4a3a-8d63-a660673b7e79-kube-api-access-v9k6g\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dmszc\" (UID: \"c86e27d0-7d7b-4a3a-8d63-a660673b7e79\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.546063 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.546038 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c86e27d0-7d7b-4a3a-8d63-a660673b7e79-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dmszc\" (UID: \"c86e27d0-7d7b-4a3a-8d63-a660673b7e79\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.553088 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.553062 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9k6g\" (UniqueName: \"kubernetes.io/projected/c86e27d0-7d7b-4a3a-8d63-a660673b7e79-kube-api-access-v9k6g\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dmszc\" (UID: \"c86e27d0-7d7b-4a3a-8d63-a660673b7e79\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.641755 ip-10-0-136-85 kubenswrapper[2535]: 
I0422 18:51:03.641676 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:03.764297 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:03.764260 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc"] Apr 22 18:51:03.767631 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:51:03.767588 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86e27d0_7d7b_4a3a_8d63_a660673b7e79.slice/crio-b29ded1cdd02a64ed800b004708f45009847c6dedd820788af0c9a817104f4f9 WatchSource:0}: Error finding container b29ded1cdd02a64ed800b004708f45009847c6dedd820788af0c9a817104f4f9: Status 404 returned error can't find the container with id b29ded1cdd02a64ed800b004708f45009847c6dedd820788af0c9a817104f4f9 Apr 22 18:51:04.195855 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:04.195820 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" event={"ID":"c86e27d0-7d7b-4a3a-8d63-a660673b7e79","Type":"ContainerStarted","Data":"b29ded1cdd02a64ed800b004708f45009847c6dedd820788af0c9a817104f4f9"} Apr 22 18:51:05.110037 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:05.110007 2535 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:51:09.209704 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.209672 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" event={"ID":"c86e27d0-7d7b-4a3a-8d63-a660673b7e79","Type":"ContainerStarted","Data":"cfc0592304c11a18e613d502dbc37a675ccac76a794152fb965bc02e0ff5a8db"} Apr 22 18:51:09.210111 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.209778 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" Apr 22 18:51:09.229255 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.229197 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc" podStartSLOduration=1.114001574 podStartE2EDuration="6.229181891s" podCreationTimestamp="2026-04-22 18:51:03 +0000 UTC" firstStartedPulling="2026-04-22 18:51:03.769239988 +0000 UTC m=+299.133040063" lastFinishedPulling="2026-04-22 18:51:08.884420513 +0000 UTC m=+304.248220380" observedRunningTime="2026-04-22 18:51:09.227371102 +0000 UTC m=+304.591170992" watchObservedRunningTime="2026-04-22 18:51:09.229181891 +0000 UTC m=+304.592981781" Apr 22 18:51:09.423590 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.423554 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-96rh5"] Apr 22 18:51:09.427073 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.427054 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.429475 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.429454 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:51:09.429577 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.429502 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:51:09.429626 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.429600 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k65j8\"" Apr 22 18:51:09.435296 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.435265 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-96rh5"] Apr 22 18:51:09.585454 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.585377 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.585454 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.585411 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b786b9d2-7645-4ab1-bdfe-73261d723b0a-cabundle0\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.585454 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.585444 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmqkd\" (UniqueName: 
\"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-kube-api-access-bmqkd\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.686714 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.686668 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.686714 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.686714 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b786b9d2-7645-4ab1-bdfe-73261d723b0a-cabundle0\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.686886 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.686812 2535 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:51:09.686886 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.686831 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:51:09.686886 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.686840 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-96rh5: references non-existent secret key: ca.crt Apr 22 18:51:09.686886 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.686886 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates podName:b786b9d2-7645-4ab1-bdfe-73261d723b0a nodeName:}" failed. 
No retries permitted until 2026-04-22 18:51:10.186870665 +0000 UTC m=+305.550670532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates") pod "keda-operator-ffbb595cb-96rh5" (UID: "b786b9d2-7645-4ab1-bdfe-73261d723b0a") : references non-existent secret key: ca.crt Apr 22 18:51:09.687047 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.686962 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmqkd\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-kube-api-access-bmqkd\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.687528 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.687507 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b786b9d2-7645-4ab1-bdfe-73261d723b0a-cabundle0\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.699489 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.699464 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmqkd\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-kube-api-access-bmqkd\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:09.702939 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.702919 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"] Apr 22 18:51:09.706010 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.705993 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.708582 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.708563 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 18:51:09.718357 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.718337 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"] Apr 22 18:51:09.788176 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.788151 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl2cv\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-kube-api-access-pl2cv\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.788320 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.788195 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.788320 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.788259 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/10540aef-b319-4d03-af6d-57efac318830-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.889566 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.889488 2535 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/10540aef-b319-4d03-af6d-57efac318830-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.889720 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.889587 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl2cv\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-kube-api-access-pl2cv\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.889720 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.889624 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.889828 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.889722 2535 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:51:09.889828 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.889733 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:51:09.889828 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.889752 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2: references non-existent secret key: tls.crt Apr 22 18:51:09.889828 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:09.889801 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates 
podName:10540aef-b319-4d03-af6d-57efac318830 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:10.389787382 +0000 UTC m=+305.753587249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates") pod "keda-metrics-apiserver-7c9f485588-hhjc2" (UID: "10540aef-b319-4d03-af6d-57efac318830") : references non-existent secret key: tls.crt Apr 22 18:51:09.890047 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.889933 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/10540aef-b319-4d03-af6d-57efac318830-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.899016 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.898990 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl2cv\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-kube-api-access-pl2cv\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:09.909543 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.909521 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-vr86c"] Apr 22 18:51:09.912539 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.912524 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:09.915274 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.915255 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 18:51:09.921495 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.921475 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vr86c"] Apr 22 18:51:09.990984 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.990959 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf7l\" (UniqueName: \"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-kube-api-access-gxf7l\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:09.991098 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:09.991018 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-certificates\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:10.091532 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.091484 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf7l\" (UniqueName: \"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-kube-api-access-gxf7l\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:10.091532 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.091539 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-certificates\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:10.091785 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.091646 2535 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 22 18:51:10.091785 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.091665 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-vr86c: secret "keda-admission-webhooks-certs" not found Apr 22 18:51:10.091785 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.091725 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-certificates podName:79e949a8-3336-4f0a-a9d2-9cc0e1021693 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:10.591709844 +0000 UTC m=+305.955509712 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-certificates") pod "keda-admission-cf49989db-vr86c" (UID: "79e949a8-3336-4f0a-a9d2-9cc0e1021693") : secret "keda-admission-webhooks-certs" not found Apr 22 18:51:10.104926 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.104869 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf7l\" (UniqueName: \"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-kube-api-access-gxf7l\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:10.193095 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.193036 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5" Apr 22 18:51:10.193287 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.193190 2535 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:51:10.193287 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.193212 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:51:10.193287 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.193224 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-96rh5: references non-existent secret key: ca.crt Apr 22 18:51:10.193287 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.193288 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates podName:b786b9d2-7645-4ab1-bdfe-73261d723b0a nodeName:}" 
failed. No retries permitted until 2026-04-22 18:51:11.193270057 +0000 UTC m=+306.557069927 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates") pod "keda-operator-ffbb595cb-96rh5" (UID: "b786b9d2-7645-4ab1-bdfe-73261d723b0a") : references non-existent secret key: ca.crt Apr 22 18:51:10.394479 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.394449 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" Apr 22 18:51:10.394829 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.394585 2535 secret.go:281] references non-existent secret key: tls.crt Apr 22 18:51:10.394829 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.394603 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 18:51:10.394829 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.394622 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2: references non-existent secret key: tls.crt Apr 22 18:51:10.394829 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:10.394674 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates podName:10540aef-b319-4d03-af6d-57efac318830 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:11.394658685 +0000 UTC m=+306.758458552 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates") pod "keda-metrics-apiserver-7c9f485588-hhjc2" (UID: "10540aef-b319-4d03-af6d-57efac318830") : references non-existent secret key: tls.crt Apr 22 18:51:10.595618 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.595537 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-certificates\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:10.598005 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.597979 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/79e949a8-3336-4f0a-a9d2-9cc0e1021693-certificates\") pod \"keda-admission-cf49989db-vr86c\" (UID: \"79e949a8-3336-4f0a-a9d2-9cc0e1021693\") " pod="openshift-keda/keda-admission-cf49989db-vr86c" Apr 22 18:51:10.824140 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.824105 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-vr86c"
Apr 22 18:51:10.946083 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.946051 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-vr86c"]
Apr 22 18:51:10.949152 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:51:10.949117 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e949a8_3336_4f0a_a9d2_9cc0e1021693.slice/crio-ec83ea300cbe829c5142bc607162afb4e2cddace01feb6007284ed40b47964e1 WatchSource:0}: Error finding container ec83ea300cbe829c5142bc607162afb4e2cddace01feb6007284ed40b47964e1: Status 404 returned error can't find the container with id ec83ea300cbe829c5142bc607162afb4e2cddace01feb6007284ed40b47964e1
Apr 22 18:51:10.950594 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:10.950577 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:51:11.201783 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:11.201749 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:51:11.201967 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.201940 2535 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:51:11.201967 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.201954 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:51:11.201967 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.201963 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-96rh5: references non-existent secret key: ca.crt
Apr 22 18:51:11.202070 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.202023 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates podName:b786b9d2-7645-4ab1-bdfe-73261d723b0a nodeName:}" failed. No retries permitted until 2026-04-22 18:51:13.201999765 +0000 UTC m=+308.565799632 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates") pod "keda-operator-ffbb595cb-96rh5" (UID: "b786b9d2-7645-4ab1-bdfe-73261d723b0a") : references non-existent secret key: ca.crt
Apr 22 18:51:11.216425 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:11.216398 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vr86c" event={"ID":"79e949a8-3336-4f0a-a9d2-9cc0e1021693","Type":"ContainerStarted","Data":"ec83ea300cbe829c5142bc607162afb4e2cddace01feb6007284ed40b47964e1"}
Apr 22 18:51:11.403798 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:11.403766 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:11.404140 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.403952 2535 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:51:11.404140 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.403973 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:51:11.404140 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.403991 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2: references non-existent secret key: tls.crt
Apr 22 18:51:11.404140 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:11.404044 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates podName:10540aef-b319-4d03-af6d-57efac318830 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:13.404027201 +0000 UTC m=+308.767827068 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates") pod "keda-metrics-apiserver-7c9f485588-hhjc2" (UID: "10540aef-b319-4d03-af6d-57efac318830") : references non-existent secret key: tls.crt
Apr 22 18:51:13.217075 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:13.217039 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:51:13.217449 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.217151 2535 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:51:13.217449 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.217163 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:51:13.217449 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.217172 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-96rh5: references non-existent secret key: ca.crt
Apr 22 18:51:13.217449 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.217226 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates podName:b786b9d2-7645-4ab1-bdfe-73261d723b0a nodeName:}" failed. No retries permitted until 2026-04-22 18:51:17.217212688 +0000 UTC m=+312.581012555 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates") pod "keda-operator-ffbb595cb-96rh5" (UID: "b786b9d2-7645-4ab1-bdfe-73261d723b0a") : references non-existent secret key: ca.crt
Apr 22 18:51:13.223329 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:13.223291 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-vr86c" event={"ID":"79e949a8-3336-4f0a-a9d2-9cc0e1021693","Type":"ContainerStarted","Data":"8d07f345069ec6c4973eed4a8e71d530ef12f7cdb77d691d603af8600e597bc4"}
Apr 22 18:51:13.223438 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:13.223355 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-vr86c"
Apr 22 18:51:13.240451 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:13.240409 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-vr86c" podStartSLOduration=2.54195303 podStartE2EDuration="4.24039656s" podCreationTimestamp="2026-04-22 18:51:09 +0000 UTC" firstStartedPulling="2026-04-22 18:51:10.950696974 +0000 UTC m=+306.314496841" lastFinishedPulling="2026-04-22 18:51:12.649140497 +0000 UTC m=+308.012940371" observedRunningTime="2026-04-22 18:51:13.238024697 +0000 UTC m=+308.601824583" watchObservedRunningTime="2026-04-22 18:51:13.24039656 +0000 UTC m=+308.604196456"
Apr 22 18:51:13.418024 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:13.417983 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:13.418180 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.418114 2535 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:51:13.418180 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.418135 2535 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:51:13.418180 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.418153 2535 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2: references non-existent secret key: tls.crt
Apr 22 18:51:13.418300 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:51:13.418203 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates podName:10540aef-b319-4d03-af6d-57efac318830 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:17.418188942 +0000 UTC m=+312.781988808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates") pod "keda-metrics-apiserver-7c9f485588-hhjc2" (UID: "10540aef-b319-4d03-af6d-57efac318830") : references non-existent secret key: tls.crt
Apr 22 18:51:17.247603 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.247567 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:51:17.250044 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.250023 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b786b9d2-7645-4ab1-bdfe-73261d723b0a-certificates\") pod \"keda-operator-ffbb595cb-96rh5\" (UID: \"b786b9d2-7645-4ab1-bdfe-73261d723b0a\") " pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:51:17.449921 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.449865 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:17.452591 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.452566 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/10540aef-b319-4d03-af6d-57efac318830-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hhjc2\" (UID: \"10540aef-b319-4d03-af6d-57efac318830\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:17.517071 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.517007 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:17.538741 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.538716 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:51:17.642626 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.642596 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"]
Apr 22 18:51:17.645037 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:51:17.645009 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10540aef_b319_4d03_af6d_57efac318830.slice/crio-97edd51b24131f6276171cd3c4b96a9bd0a0e4040309dc75a82130a100504b54 WatchSource:0}: Error finding container 97edd51b24131f6276171cd3c4b96a9bd0a0e4040309dc75a82130a100504b54: Status 404 returned error can't find the container with id 97edd51b24131f6276171cd3c4b96a9bd0a0e4040309dc75a82130a100504b54
Apr 22 18:51:17.664872 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:17.664848 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-96rh5"]
Apr 22 18:51:17.667417 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:51:17.667384 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb786b9d2_7645_4ab1_bdfe_73261d723b0a.slice/crio-05e5a72fced8121a84d88dbff00db4f5123bc221d4ccf8f6b05fb99d128b839d WatchSource:0}: Error finding container 05e5a72fced8121a84d88dbff00db4f5123bc221d4ccf8f6b05fb99d128b839d: Status 404 returned error can't find the container with id 05e5a72fced8121a84d88dbff00db4f5123bc221d4ccf8f6b05fb99d128b839d
Apr 22 18:51:18.237682 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:18.237641 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" event={"ID":"10540aef-b319-4d03-af6d-57efac318830","Type":"ContainerStarted","Data":"97edd51b24131f6276171cd3c4b96a9bd0a0e4040309dc75a82130a100504b54"}
Apr 22 18:51:18.238579 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:18.238543 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-96rh5" event={"ID":"b786b9d2-7645-4ab1-bdfe-73261d723b0a","Type":"ContainerStarted","Data":"05e5a72fced8121a84d88dbff00db4f5123bc221d4ccf8f6b05fb99d128b839d"}
Apr 22 18:51:22.252283 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:22.252250 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" event={"ID":"10540aef-b319-4d03-af6d-57efac318830","Type":"ContainerStarted","Data":"4a44ed173d7f7df5be2c5bac8c1cdd6cc0c1294235eeaa02569662ffc9cdb9a2"}
Apr 22 18:51:22.252735 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:22.252344 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:22.253570 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:22.253545 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-96rh5" event={"ID":"b786b9d2-7645-4ab1-bdfe-73261d723b0a","Type":"ContainerStarted","Data":"95d1e7a359ed4b33b35b03e6f5fca6b5763254c02c7fea1b4e17e072f274a471"}
Apr 22 18:51:22.253693 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:22.253670 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:51:22.278842 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:22.278803 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2" podStartSLOduration=9.612090089 podStartE2EDuration="13.278793113s" podCreationTimestamp="2026-04-22 18:51:09 +0000 UTC" firstStartedPulling="2026-04-22 18:51:17.646379437 +0000 UTC m=+313.010179304" lastFinishedPulling="2026-04-22 18:51:21.313082449 +0000 UTC m=+316.676882328" observedRunningTime="2026-04-22 18:51:22.277732936 +0000 UTC m=+317.641532825" watchObservedRunningTime="2026-04-22 18:51:22.278793113 +0000 UTC m=+317.642593003"
Apr 22 18:51:22.301437 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:22.301393 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-96rh5" podStartSLOduration=9.652577449 podStartE2EDuration="13.301383495s" podCreationTimestamp="2026-04-22 18:51:09 +0000 UTC" firstStartedPulling="2026-04-22 18:51:17.668570333 +0000 UTC m=+313.032370201" lastFinishedPulling="2026-04-22 18:51:21.317376377 +0000 UTC m=+316.681176247" observedRunningTime="2026-04-22 18:51:22.300088431 +0000 UTC m=+317.663888328" watchObservedRunningTime="2026-04-22 18:51:22.301383495 +0000 UTC m=+317.665183383"
Apr 22 18:51:30.214885 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:30.214851 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dmszc"
Apr 22 18:51:33.260841 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:33.260813 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hhjc2"
Apr 22 18:51:34.228131 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:34.228096 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-vr86c"
Apr 22 18:51:43.258612 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:51:43.258579 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-96rh5"
Apr 22 18:52:14.247839 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.247755 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-hzlgb"]
Apr 22 18:52:14.249977 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.249954 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.252372 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.252350 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 18:52:14.252491 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.252354 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:52:14.252491 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.252390 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:52:14.253290 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.253272 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-b4p6c\""
Apr 22 18:52:14.255076 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.255052 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"]
Apr 22 18:52:14.256773 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.256753 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.259018 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.258994 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 22 18:52:14.259018 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.259003 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-svsmq\""
Apr 22 18:52:14.261526 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.261503 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-hzlgb"]
Apr 22 18:52:14.269920 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.269877 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"]
Apr 22 18:52:14.346052 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.346024 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38c7c659-31c5-470b-a597-f8e654ee068a-cert\") pod \"kserve-controller-manager-6f655776dd-hzlgb\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") " pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.346227 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.346076 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbn2\" (UniqueName: \"kubernetes.io/projected/afa04f76-b491-46ff-b211-46822d1c0128-kube-api-access-2lbn2\") pod \"llmisvc-controller-manager-68cc5db7c4-h4wrj\" (UID: \"afa04f76-b491-46ff-b211-46822d1c0128\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.346285 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.346221 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa04f76-b491-46ff-b211-46822d1c0128-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-h4wrj\" (UID: \"afa04f76-b491-46ff-b211-46822d1c0128\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.346285 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.346275 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggbv4\" (UniqueName: \"kubernetes.io/projected/38c7c659-31c5-470b-a597-f8e654ee068a-kube-api-access-ggbv4\") pod \"kserve-controller-manager-6f655776dd-hzlgb\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") " pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.447536 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.447510 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38c7c659-31c5-470b-a597-f8e654ee068a-cert\") pod \"kserve-controller-manager-6f655776dd-hzlgb\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") " pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.447717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.447547 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbn2\" (UniqueName: \"kubernetes.io/projected/afa04f76-b491-46ff-b211-46822d1c0128-kube-api-access-2lbn2\") pod \"llmisvc-controller-manager-68cc5db7c4-h4wrj\" (UID: \"afa04f76-b491-46ff-b211-46822d1c0128\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.447717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.447580 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa04f76-b491-46ff-b211-46822d1c0128-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-h4wrj\" (UID: \"afa04f76-b491-46ff-b211-46822d1c0128\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.447717 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.447612 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggbv4\" (UniqueName: \"kubernetes.io/projected/38c7c659-31c5-470b-a597-f8e654ee068a-kube-api-access-ggbv4\") pod \"kserve-controller-manager-6f655776dd-hzlgb\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") " pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.450189 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.450165 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38c7c659-31c5-470b-a597-f8e654ee068a-cert\") pod \"kserve-controller-manager-6f655776dd-hzlgb\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") " pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.450289 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.450221 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/afa04f76-b491-46ff-b211-46822d1c0128-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-h4wrj\" (UID: \"afa04f76-b491-46ff-b211-46822d1c0128\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.455799 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.455772 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggbv4\" (UniqueName: \"kubernetes.io/projected/38c7c659-31c5-470b-a597-f8e654ee068a-kube-api-access-ggbv4\") pod \"kserve-controller-manager-6f655776dd-hzlgb\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") " pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.456228 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.456204 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbn2\" (UniqueName: \"kubernetes.io/projected/afa04f76-b491-46ff-b211-46822d1c0128-kube-api-access-2lbn2\") pod \"llmisvc-controller-manager-68cc5db7c4-h4wrj\" (UID: \"afa04f76-b491-46ff-b211-46822d1c0128\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.562603 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.562543 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:14.571248 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.571225 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:14.698651 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.698523 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-hzlgb"]
Apr 22 18:52:14.701384 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:52:14.701352 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c7c659_31c5_470b_a597_f8e654ee068a.slice/crio-e1cf425fd9eda75f6c39e8a29db0db86f70a06af61afd453bb0329fc358f65df WatchSource:0}: Error finding container e1cf425fd9eda75f6c39e8a29db0db86f70a06af61afd453bb0329fc358f65df: Status 404 returned error can't find the container with id e1cf425fd9eda75f6c39e8a29db0db86f70a06af61afd453bb0329fc358f65df
Apr 22 18:52:14.718077 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:14.717965 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"]
Apr 22 18:52:14.720429 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:52:14.720406 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podafa04f76_b491_46ff_b211_46822d1c0128.slice/crio-6479ebe1de59ac733624deef4e1c0f0ce6a4e86993d5d7dc81c0b1a1eebb7a7e WatchSource:0}: Error finding container 6479ebe1de59ac733624deef4e1c0f0ce6a4e86993d5d7dc81c0b1a1eebb7a7e: Status 404 returned error can't find the container with id 6479ebe1de59ac733624deef4e1c0f0ce6a4e86993d5d7dc81c0b1a1eebb7a7e
Apr 22 18:52:15.391444 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:15.391388 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj" event={"ID":"afa04f76-b491-46ff-b211-46822d1c0128","Type":"ContainerStarted","Data":"6479ebe1de59ac733624deef4e1c0f0ce6a4e86993d5d7dc81c0b1a1eebb7a7e"}
Apr 22 18:52:15.392847 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:15.392818 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb" event={"ID":"38c7c659-31c5-470b-a597-f8e654ee068a","Type":"ContainerStarted","Data":"e1cf425fd9eda75f6c39e8a29db0db86f70a06af61afd453bb0329fc358f65df"}
Apr 22 18:52:18.404937 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:18.404882 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj" event={"ID":"afa04f76-b491-46ff-b211-46822d1c0128","Type":"ContainerStarted","Data":"e44bb4f2887e65789cd664618fe4428f894cb3a94785fda6844d59ed845f7aff"}
Apr 22 18:52:18.405365 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:18.405113 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:18.406223 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:18.406201 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb" event={"ID":"38c7c659-31c5-470b-a597-f8e654ee068a","Type":"ContainerStarted","Data":"2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108"}
Apr 22 18:52:18.406321 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:18.406311 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:18.421372 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:18.421331 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj" podStartSLOduration=1.256882376 podStartE2EDuration="4.421321019s" podCreationTimestamp="2026-04-22 18:52:14 +0000 UTC" firstStartedPulling="2026-04-22 18:52:14.721706022 +0000 UTC m=+370.085505889" lastFinishedPulling="2026-04-22 18:52:17.886144652 +0000 UTC m=+373.249944532" observedRunningTime="2026-04-22 18:52:18.420774826 +0000 UTC m=+373.784574716" watchObservedRunningTime="2026-04-22 18:52:18.421321019 +0000 UTC m=+373.785120913"
Apr 22 18:52:18.437355 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:18.437315 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb" podStartSLOduration=1.208160026 podStartE2EDuration="4.437305763s" podCreationTimestamp="2026-04-22 18:52:14 +0000 UTC" firstStartedPulling="2026-04-22 18:52:14.702737295 +0000 UTC m=+370.066537162" lastFinishedPulling="2026-04-22 18:52:17.931883032 +0000 UTC m=+373.295682899" observedRunningTime="2026-04-22 18:52:18.436732653 +0000 UTC m=+373.800532543" watchObservedRunningTime="2026-04-22 18:52:18.437305763 +0000 UTC m=+373.801105652"
Apr 22 18:52:49.412921 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:49.412875 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-h4wrj"
Apr 22 18:52:49.415955 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:49.415934 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:50.810181 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:50.810139 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-hzlgb"]
Apr 22 18:52:50.810572 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:50.810370 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb" podUID="38c7c659-31c5-470b-a597-f8e654ee068a" containerName="manager" containerID="cri-o://2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108" gracePeriod=10
Apr 22 18:52:50.874931 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:50.874887 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-xcjhs"]
Apr 22 18:52:50.899330 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:50.899309 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-xcjhs"]
Apr 22 18:52:50.899432 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:50.899416 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.019555 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.019521 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55vl7\" (UniqueName: \"kubernetes.io/projected/7a0e563c-f2e0-49df-a8b6-91b9d8728e7a-kube-api-access-55vl7\") pod \"kserve-controller-manager-6f655776dd-xcjhs\" (UID: \"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a\") " pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.019692 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.019641 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a0e563c-f2e0-49df-a8b6-91b9d8728e7a-cert\") pod \"kserve-controller-manager-6f655776dd-xcjhs\" (UID: \"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a\") " pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.061392 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.061338 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:51.120751 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.120720 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a0e563c-f2e0-49df-a8b6-91b9d8728e7a-cert\") pod \"kserve-controller-manager-6f655776dd-xcjhs\" (UID: \"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a\") " pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.120970 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.120777 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55vl7\" (UniqueName: \"kubernetes.io/projected/7a0e563c-f2e0-49df-a8b6-91b9d8728e7a-kube-api-access-55vl7\") pod \"kserve-controller-manager-6f655776dd-xcjhs\" (UID: \"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a\") " pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.123434 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.123413 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a0e563c-f2e0-49df-a8b6-91b9d8728e7a-cert\") pod \"kserve-controller-manager-6f655776dd-xcjhs\" (UID: \"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a\") " pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.129921 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.129875 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55vl7\" (UniqueName: \"kubernetes.io/projected/7a0e563c-f2e0-49df-a8b6-91b9d8728e7a-kube-api-access-55vl7\") pod \"kserve-controller-manager-6f655776dd-xcjhs\" (UID: \"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a\") " pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.221695 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.221668 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggbv4\" (UniqueName: \"kubernetes.io/projected/38c7c659-31c5-470b-a597-f8e654ee068a-kube-api-access-ggbv4\") pod \"38c7c659-31c5-470b-a597-f8e654ee068a\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") "
Apr 22 18:52:51.221847 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.221721 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38c7c659-31c5-470b-a597-f8e654ee068a-cert\") pod \"38c7c659-31c5-470b-a597-f8e654ee068a\" (UID: \"38c7c659-31c5-470b-a597-f8e654ee068a\") "
Apr 22 18:52:51.223937 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.223888 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38c7c659-31c5-470b-a597-f8e654ee068a-cert" (OuterVolumeSpecName: "cert") pod "38c7c659-31c5-470b-a597-f8e654ee068a" (UID: "38c7c659-31c5-470b-a597-f8e654ee068a"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:52:51.224020 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.223958 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c7c659-31c5-470b-a597-f8e654ee068a-kube-api-access-ggbv4" (OuterVolumeSpecName: "kube-api-access-ggbv4") pod "38c7c659-31c5-470b-a597-f8e654ee068a" (UID: "38c7c659-31c5-470b-a597-f8e654ee068a"). InnerVolumeSpecName "kube-api-access-ggbv4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:51.271777 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.271756 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:51.322219 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.322192 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggbv4\" (UniqueName: \"kubernetes.io/projected/38c7c659-31c5-470b-a597-f8e654ee068a-kube-api-access-ggbv4\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:52:51.322219 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.322219 2535 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/38c7c659-31c5-470b-a597-f8e654ee068a-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\""
Apr 22 18:52:51.389459 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.389428 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-xcjhs"]
Apr 22 18:52:51.392129 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:52:51.392105 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a0e563c_f2e0_49df_a8b6_91b9d8728e7a.slice/crio-20605c4e0f0d65d08765a059f3bfa05f4ffac8637876e96eaf2b51169805fab2 WatchSource:0}: Error finding container 20605c4e0f0d65d08765a059f3bfa05f4ffac8637876e96eaf2b51169805fab2: Status 404 returned error can't find the container with id 20605c4e0f0d65d08765a059f3bfa05f4ffac8637876e96eaf2b51169805fab2
Apr 22 18:52:51.504563 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.504532 2535 generic.go:358] "Generic (PLEG): container finished" podID="38c7c659-31c5-470b-a597-f8e654ee068a" containerID="2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108" exitCode=0
Apr 22 18:52:51.504698 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.504593 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb"
Apr 22 18:52:51.504698 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.504596 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb" event={"ID":"38c7c659-31c5-470b-a597-f8e654ee068a","Type":"ContainerDied","Data":"2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108"}
Apr 22 18:52:51.504813 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.504713 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-hzlgb" event={"ID":"38c7c659-31c5-470b-a597-f8e654ee068a","Type":"ContainerDied","Data":"e1cf425fd9eda75f6c39e8a29db0db86f70a06af61afd453bb0329fc358f65df"}
Apr 22 18:52:51.504813 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.504737 2535 scope.go:117] "RemoveContainer" containerID="2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108"
Apr 22 18:52:51.505734 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.505686 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs" event={"ID":"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a","Type":"ContainerStarted","Data":"20605c4e0f0d65d08765a059f3bfa05f4ffac8637876e96eaf2b51169805fab2"}
Apr 22 18:52:51.512305 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.512289 2535 scope.go:117] "RemoveContainer" containerID="2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108"
Apr 22 18:52:51.512541 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:52:51.512524 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108\": container with ID starting with 2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108 not found: ID does not exist" containerID="2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108"
Apr 22 18:52:51.512589 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.512549 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108"} err="failed to get container status \"2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108\": rpc error: code = NotFound desc = could not find container \"2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108\": container with ID starting with 2eb5357f30062d30e963b680bb25d46759270e921b2cbc502f5e2e60ec28c108 not found: ID does not exist"
Apr 22 18:52:51.519097 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.519078 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-hzlgb"]
Apr 22 18:52:51.522457 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:51.522438 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-hzlgb"]
Apr 22 18:52:52.510736 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:52.510704 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs" event={"ID":"7a0e563c-f2e0-49df-a8b6-91b9d8728e7a","Type":"ContainerStarted","Data":"494435c33e4cd1d7060e1bb533eb3a1e9ad0bc169d987e779045ffaef8b8ac5b"}
Apr 22 18:52:52.511133 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:52.510834 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs"
Apr 22 18:52:52.526591 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:52.526541 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs" podStartSLOduration=1.98894964 podStartE2EDuration="2.526527337s" podCreationTimestamp="2026-04-22 18:52:50 +0000 UTC" firstStartedPulling="2026-04-22 18:52:51.393384899 +0000 UTC m=+406.757184768" lastFinishedPulling="2026-04-22
18:52:51.930962597 +0000 UTC m=+407.294762465" observedRunningTime="2026-04-22 18:52:52.525325028 +0000 UTC m=+407.889124916" watchObservedRunningTime="2026-04-22 18:52:52.526527337 +0000 UTC m=+407.890327226" Apr 22 18:52:53.258523 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:52:53.258490 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c7c659-31c5-470b-a597-f8e654ee068a" path="/var/lib/kubelet/pods/38c7c659-31c5-470b-a597-f8e654ee068a/volumes" Apr 22 18:53:23.519197 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:23.519165 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-xcjhs" Apr 22 18:53:24.331915 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.331864 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-nj94k"] Apr 22 18:53:24.332306 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.332291 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38c7c659-31c5-470b-a597-f8e654ee068a" containerName="manager" Apr 22 18:53:24.332349 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.332310 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c7c659-31c5-470b-a597-f8e654ee068a" containerName="manager" Apr 22 18:53:24.332400 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.332390 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="38c7c659-31c5-470b-a597-f8e654ee068a" containerName="manager" Apr 22 18:53:24.336606 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.336585 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.339053 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.339027 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d8kwb\"" Apr 22 18:53:24.339161 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.339068 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:53:24.345541 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.345516 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nj94k"] Apr 22 18:53:24.458803 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.458722 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5383b6a7-aaf9-4d87-a3bf-a95b88093788-tls-certs\") pod \"model-serving-api-86f7b4b499-nj94k\" (UID: \"5383b6a7-aaf9-4d87-a3bf-a95b88093788\") " pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.458972 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.458809 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrrmm\" (UniqueName: \"kubernetes.io/projected/5383b6a7-aaf9-4d87-a3bf-a95b88093788-kube-api-access-hrrmm\") pod \"model-serving-api-86f7b4b499-nj94k\" (UID: \"5383b6a7-aaf9-4d87-a3bf-a95b88093788\") " pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.559653 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.559625 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5383b6a7-aaf9-4d87-a3bf-a95b88093788-tls-certs\") pod \"model-serving-api-86f7b4b499-nj94k\" (UID: \"5383b6a7-aaf9-4d87-a3bf-a95b88093788\") " pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.560090 ip-10-0-136-85 kubenswrapper[2535]: 
I0422 18:53:24.559692 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrrmm\" (UniqueName: \"kubernetes.io/projected/5383b6a7-aaf9-4d87-a3bf-a95b88093788-kube-api-access-hrrmm\") pod \"model-serving-api-86f7b4b499-nj94k\" (UID: \"5383b6a7-aaf9-4d87-a3bf-a95b88093788\") " pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.562405 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.562379 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5383b6a7-aaf9-4d87-a3bf-a95b88093788-tls-certs\") pod \"model-serving-api-86f7b4b499-nj94k\" (UID: \"5383b6a7-aaf9-4d87-a3bf-a95b88093788\") " pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.568167 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.568140 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrrmm\" (UniqueName: \"kubernetes.io/projected/5383b6a7-aaf9-4d87-a3bf-a95b88093788-kube-api-access-hrrmm\") pod \"model-serving-api-86f7b4b499-nj94k\" (UID: \"5383b6a7-aaf9-4d87-a3bf-a95b88093788\") " pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.648027 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.647950 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:24.767871 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:24.767847 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-nj94k"] Apr 22 18:53:25.607069 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:25.607027 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nj94k" event={"ID":"5383b6a7-aaf9-4d87-a3bf-a95b88093788","Type":"ContainerStarted","Data":"df301a552fd375976d04d5ad014d8900847e2775f92f4956c5d9db22aa91738c"} Apr 22 18:53:27.618557 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:27.618523 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-nj94k" event={"ID":"5383b6a7-aaf9-4d87-a3bf-a95b88093788","Type":"ContainerStarted","Data":"4918d948f212037a87dedd2223045bbdafad9a2156a83748197f24b5f6975de7"} Apr 22 18:53:27.618951 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:27.618640 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 18:53:27.634652 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:27.634598 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-nj94k" podStartSLOduration=1.4048241799999999 podStartE2EDuration="3.634584878s" podCreationTimestamp="2026-04-22 18:53:24 +0000 UTC" firstStartedPulling="2026-04-22 18:53:24.77346007 +0000 UTC m=+440.137259937" lastFinishedPulling="2026-04-22 18:53:27.00322075 +0000 UTC m=+442.367020635" observedRunningTime="2026-04-22 18:53:27.634413537 +0000 UTC m=+442.998213427" watchObservedRunningTime="2026-04-22 18:53:27.634584878 +0000 UTC m=+442.998384768" Apr 22 18:53:38.625605 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:38.625576 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-nj94k" Apr 22 
18:53:44.194939 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.194889 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f5c6bdb9d-llxc5"] Apr 22 18:53:44.198116 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.198097 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.208076 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.208051 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5c6bdb9d-llxc5"] Apr 22 18:53:44.315821 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.315792 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e22475-c5a3-49c7-a874-a8337b0db905-console-serving-cert\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.316000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.315834 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-console-config\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.316000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.315861 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-service-ca\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.316000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.315884 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-trusted-ca-bundle\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.316000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.315927 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-oauth-serving-cert\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.316000 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.315948 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90e22475-c5a3-49c7-a874-a8337b0db905-console-oauth-config\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.316206 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.316011 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85l9m\" (UniqueName: \"kubernetes.io/projected/90e22475-c5a3-49c7-a874-a8337b0db905-kube-api-access-85l9m\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417033 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417001 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-console-config\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " 
pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417163 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417045 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-service-ca\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417163 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417076 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-trusted-ca-bundle\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417163 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417100 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-oauth-serving-cert\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417163 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417131 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90e22475-c5a3-49c7-a874-a8337b0db905-console-oauth-config\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417306 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417183 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85l9m\" (UniqueName: \"kubernetes.io/projected/90e22475-c5a3-49c7-a874-a8337b0db905-kube-api-access-85l9m\") pod 
\"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417306 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417219 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e22475-c5a3-49c7-a874-a8337b0db905-console-serving-cert\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417756 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417733 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-console-config\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417883 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417859 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-oauth-serving-cert\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.417986 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.417870 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-service-ca\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.418198 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.418176 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90e22475-c5a3-49c7-a874-a8337b0db905-trusted-ca-bundle\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.419800 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.419781 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/90e22475-c5a3-49c7-a874-a8337b0db905-console-oauth-config\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.419862 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.419846 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/90e22475-c5a3-49c7-a874-a8337b0db905-console-serving-cert\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.425922 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.425885 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85l9m\" (UniqueName: \"kubernetes.io/projected/90e22475-c5a3-49c7-a874-a8337b0db905-kube-api-access-85l9m\") pod \"console-5f5c6bdb9d-llxc5\" (UID: \"90e22475-c5a3-49c7-a874-a8337b0db905\") " pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.507164 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.507099 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:44.629166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.629138 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f5c6bdb9d-llxc5"] Apr 22 18:53:44.631267 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:53:44.631236 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e22475_c5a3_49c7_a874_a8337b0db905.slice/crio-a583ed3eb1bdf94859cea06b370b861cef924e279b1d5d2a1190f2137c504cf6 WatchSource:0}: Error finding container a583ed3eb1bdf94859cea06b370b861cef924e279b1d5d2a1190f2137c504cf6: Status 404 returned error can't find the container with id a583ed3eb1bdf94859cea06b370b861cef924e279b1d5d2a1190f2137c504cf6 Apr 22 18:53:44.672255 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:44.672226 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5c6bdb9d-llxc5" event={"ID":"90e22475-c5a3-49c7-a874-a8337b0db905","Type":"ContainerStarted","Data":"a583ed3eb1bdf94859cea06b370b861cef924e279b1d5d2a1190f2137c504cf6"} Apr 22 18:53:45.676831 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:45.676795 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f5c6bdb9d-llxc5" event={"ID":"90e22475-c5a3-49c7-a874-a8337b0db905","Type":"ContainerStarted","Data":"a863b10e35bc07ab0f3cdf31a49c1a809c060d0e14c15dc1b1b6d296c4870e95"} Apr 22 18:53:45.694810 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:45.694768 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f5c6bdb9d-llxc5" podStartSLOduration=1.694755183 podStartE2EDuration="1.694755183s" podCreationTimestamp="2026-04-22 18:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:53:45.692358132 +0000 UTC m=+461.056158021" 
watchObservedRunningTime="2026-04-22 18:53:45.694755183 +0000 UTC m=+461.058555072" Apr 22 18:53:54.507937 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:54.507878 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:54.507937 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:54.507945 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:54.512726 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:54.512700 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:54.714194 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:54.714168 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f5c6bdb9d-llxc5" Apr 22 18:53:54.762367 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:53:54.762201 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c646db979-nf47d"] Apr 22 18:54:19.785478 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:19.785414 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c646db979-nf47d" podUID="28288801-d78f-4926-a7df-f28e3728ad53" containerName="console" containerID="cri-o://bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a" gracePeriod=15 Apr 22 18:54:20.019608 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.019584 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c646db979-nf47d_28288801-d78f-4926-a7df-f28e3728ad53/console/0.log" Apr 22 18:54:20.019718 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.019650 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c646db979-nf47d" Apr 22 18:54:20.087541 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087447 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-service-ca\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087541 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087487 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-console-config\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087541 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087524 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-trusted-ca-bundle\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087791 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087575 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-oauth-serving-cert\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087791 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087612 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-oauth-config\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087791 ip-10-0-136-85 
kubenswrapper[2535]: I0422 18:54:20.087657 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx65x\" (UniqueName: \"kubernetes.io/projected/28288801-d78f-4926-a7df-f28e3728ad53-kube-api-access-tx65x\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087791 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087679 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-serving-cert\") pod \"28288801-d78f-4926-a7df-f28e3728ad53\" (UID: \"28288801-d78f-4926-a7df-f28e3728ad53\") " Apr 22 18:54:20.087991 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087968 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-service-ca" (OuterVolumeSpecName: "service-ca") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:54:20.088032 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.087980 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-console-config" (OuterVolumeSpecName: "console-config") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:54:20.088068 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.088034 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:54:20.088281 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.088255 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:54:20.090083 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.090038 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:54:20.090172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.090104 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:54:20.090172 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.090119 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28288801-d78f-4926-a7df-f28e3728ad53-kube-api-access-tx65x" (OuterVolumeSpecName: "kube-api-access-tx65x") pod "28288801-d78f-4926-a7df-f28e3728ad53" (UID: "28288801-d78f-4926-a7df-f28e3728ad53"). InnerVolumeSpecName "kube-api-access-tx65x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:20.188632 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188596 2535 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-oauth-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.188632 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188623 2535 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-oauth-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.188632 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188633 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tx65x\" (UniqueName: \"kubernetes.io/projected/28288801-d78f-4926-a7df-f28e3728ad53-kube-api-access-tx65x\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.188846 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188643 2535 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/28288801-d78f-4926-a7df-f28e3728ad53-console-serving-cert\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.188846 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188653 2535 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-service-ca\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.188846 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188662 2535 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-console-config\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.188846 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.188670 2535 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28288801-d78f-4926-a7df-f28e3728ad53-trusted-ca-bundle\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:54:20.788869 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.788841 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c646db979-nf47d_28288801-d78f-4926-a7df-f28e3728ad53/console/0.log" Apr 22 18:54:20.789337 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.788876 2535 generic.go:358] "Generic (PLEG): container finished" podID="28288801-d78f-4926-a7df-f28e3728ad53" containerID="bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a" exitCode=2 Apr 22 18:54:20.789337 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.788970 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c646db979-nf47d" Apr 22 18:54:20.789337 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.788973 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c646db979-nf47d" event={"ID":"28288801-d78f-4926-a7df-f28e3728ad53","Type":"ContainerDied","Data":"bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a"} Apr 22 18:54:20.789337 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.789013 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c646db979-nf47d" event={"ID":"28288801-d78f-4926-a7df-f28e3728ad53","Type":"ContainerDied","Data":"c0c42f00308aaab3a7ec65539da40bd97f7daebd72c7b99edac703f00f657892"} Apr 22 18:54:20.789337 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.789030 2535 scope.go:117] "RemoveContainer" containerID="bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a" Apr 22 18:54:20.797513 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.797498 2535 scope.go:117] "RemoveContainer" containerID="bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a" Apr 22 18:54:20.797748 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:54:20.797731 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a\": container with ID starting with bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a not found: ID does not exist" containerID="bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a" Apr 22 18:54:20.797788 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.797756 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a"} err="failed to get container status \"bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a\": rpc error: code = 
NotFound desc = could not find container \"bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a\": container with ID starting with bb30dcf6d32601770b806bec4e4cd4de417cccf017f4cfe539bcbfab2b84d66a not found: ID does not exist" Apr 22 18:54:20.809594 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.809573 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c646db979-nf47d"] Apr 22 18:54:20.813412 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:20.813392 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c646db979-nf47d"] Apr 22 18:54:21.258494 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:54:21.258464 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28288801-d78f-4926-a7df-f28e3728ad53" path="/var/lib/kubelet/pods/28288801-d78f-4926-a7df-f28e3728ad53/volumes" Apr 22 18:57:15.577518 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.577472 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr"] Apr 22 18:57:15.578065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.577807 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28288801-d78f-4926-a7df-f28e3728ad53" containerName="console" Apr 22 18:57:15.578065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.577822 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="28288801-d78f-4926-a7df-f28e3728ad53" containerName="console" Apr 22 18:57:15.578065 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.577874 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="28288801-d78f-4926-a7df-f28e3728ad53" containerName="console" Apr 22 18:57:15.580700 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.580682 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:15.582880 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.582859 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-1fc88-kube-rbac-proxy-sar-config\"" Apr 22 18:57:15.583008 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.582929 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-1fc88-serving-cert\"" Apr 22 18:57:15.583008 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.582937 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j46rb\"" Apr 22 18:57:15.583776 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.583762 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:57:15.590476 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.590455 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr"] Apr 22 18:57:15.659882 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.659856 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:15.660031 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.659889 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a0894c-1131-48ed-bd73-f32120aa29a2-openshift-service-ca-bundle\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" 
(UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:15.760427 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.760394 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:15.760553 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.760440 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a0894c-1131-48ed-bd73-f32120aa29a2-openshift-service-ca-bundle\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:15.760553 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:57:15.760539 2535 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-1fc88-serving-cert: secret "model-chainer-raw-1fc88-serving-cert" not found Apr 22 18:57:15.760626 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:57:15.760614 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls podName:d2a0894c-1131-48ed-bd73-f32120aa29a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:57:16.260594431 +0000 UTC m=+671.624394301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls") pod "model-chainer-raw-1fc88-66db98d96c-4dpgr" (UID: "d2a0894c-1131-48ed-bd73-f32120aa29a2") : secret "model-chainer-raw-1fc88-serving-cert" not found Apr 22 18:57:15.761182 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:15.761163 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a0894c-1131-48ed-bd73-f32120aa29a2-openshift-service-ca-bundle\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:16.265373 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:16.265336 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:16.267823 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:16.267798 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls\") pod \"model-chainer-raw-1fc88-66db98d96c-4dpgr\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:16.492143 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:16.492113 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:16.615352 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:16.615291 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr"] Apr 22 18:57:16.615675 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:57:16.615571 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a0894c_1131_48ed_bd73_f32120aa29a2.slice/crio-4a70b8ad7fa54ac23cbc7ce87787177a71d08747be87ff86d1397a6b624dd780 WatchSource:0}: Error finding container 4a70b8ad7fa54ac23cbc7ce87787177a71d08747be87ff86d1397a6b624dd780: Status 404 returned error can't find the container with id 4a70b8ad7fa54ac23cbc7ce87787177a71d08747be87ff86d1397a6b624dd780 Apr 22 18:57:16.617446 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:16.617426 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:57:17.319033 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:17.318990 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" event={"ID":"d2a0894c-1131-48ed-bd73-f32120aa29a2","Type":"ContainerStarted","Data":"4a70b8ad7fa54ac23cbc7ce87787177a71d08747be87ff86d1397a6b624dd780"} Apr 22 18:57:19.326931 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:19.326884 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" event={"ID":"d2a0894c-1131-48ed-bd73-f32120aa29a2","Type":"ContainerStarted","Data":"88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725"} Apr 22 18:57:19.327312 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:19.327029 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:19.343187 
ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:19.343136 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podStartSLOduration=2.3398783180000002 podStartE2EDuration="4.34312302s" podCreationTimestamp="2026-04-22 18:57:15 +0000 UTC" firstStartedPulling="2026-04-22 18:57:16.617606475 +0000 UTC m=+671.981406342" lastFinishedPulling="2026-04-22 18:57:18.620851176 +0000 UTC m=+673.984651044" observedRunningTime="2026-04-22 18:57:19.342159452 +0000 UTC m=+674.705959341" watchObservedRunningTime="2026-04-22 18:57:19.34312302 +0000 UTC m=+674.706922909" Apr 22 18:57:25.334834 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:25.334805 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:25.630517 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:25.630418 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr"] Apr 22 18:57:25.630731 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:25.630654 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" containerID="cri-o://88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725" gracePeriod=30 Apr 22 18:57:30.333257 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:30.333223 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:35.333059 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:35.333018 2535 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:40.333838 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:40.333802 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:40.334234 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:40.333918 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:45.333855 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:45.333816 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:50.333538 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:50.333496 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:55.333511 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.333425 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:57:55.773997 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.773977 2535 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:55.849166 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.849132 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls\") pod \"d2a0894c-1131-48ed-bd73-f32120aa29a2\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " Apr 22 18:57:55.849322 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.849206 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a0894c-1131-48ed-bd73-f32120aa29a2-openshift-service-ca-bundle\") pod \"d2a0894c-1131-48ed-bd73-f32120aa29a2\" (UID: \"d2a0894c-1131-48ed-bd73-f32120aa29a2\") " Apr 22 18:57:55.849538 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.849511 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a0894c-1131-48ed-bd73-f32120aa29a2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "d2a0894c-1131-48ed-bd73-f32120aa29a2" (UID: "d2a0894c-1131-48ed-bd73-f32120aa29a2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:57:55.851446 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.851424 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d2a0894c-1131-48ed-bd73-f32120aa29a2" (UID: "d2a0894c-1131-48ed-bd73-f32120aa29a2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:57:55.950019 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.949994 2535 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2a0894c-1131-48ed-bd73-f32120aa29a2-proxy-tls\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:57:55.950019 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:55.950018 2535 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d2a0894c-1131-48ed-bd73-f32120aa29a2-openshift-service-ca-bundle\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:57:56.434698 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.434659 2535 generic.go:358] "Generic (PLEG): container finished" podID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerID="88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725" exitCode=0 Apr 22 18:57:56.435178 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.434714 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" event={"ID":"d2a0894c-1131-48ed-bd73-f32120aa29a2","Type":"ContainerDied","Data":"88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725"} Apr 22 18:57:56.435178 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.434728 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" Apr 22 18:57:56.435178 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.434740 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr" event={"ID":"d2a0894c-1131-48ed-bd73-f32120aa29a2","Type":"ContainerDied","Data":"4a70b8ad7fa54ac23cbc7ce87787177a71d08747be87ff86d1397a6b624dd780"} Apr 22 18:57:56.435178 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.434754 2535 scope.go:117] "RemoveContainer" containerID="88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725" Apr 22 18:57:56.443097 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.443033 2535 scope.go:117] "RemoveContainer" containerID="88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725" Apr 22 18:57:56.443416 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:57:56.443389 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725\": container with ID starting with 88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725 not found: ID does not exist" containerID="88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725" Apr 22 18:57:56.443495 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.443423 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725"} err="failed to get container status \"88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725\": rpc error: code = NotFound desc = could not find container \"88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725\": container with ID starting with 88d7bdd7f6901faa550a638cbd809512c8eed60a9d493762629e793ac87d0725 not found: ID does not exist" Apr 22 18:57:56.455623 ip-10-0-136-85 kubenswrapper[2535]: I0422 
18:57:56.455594 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr"] Apr 22 18:57:56.459777 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:56.459753 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-1fc88-66db98d96c-4dpgr"] Apr 22 18:57:57.259475 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:57:57.259443 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" path="/var/lib/kubelet/pods/d2a0894c-1131-48ed-bd73-f32120aa29a2/volumes" Apr 22 18:58:55.868963 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.868931 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz"] Apr 22 18:58:55.869347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.869253 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" Apr 22 18:58:55.869347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.869265 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" Apr 22 18:58:55.869347 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.869309 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2a0894c-1131-48ed-bd73-f32120aa29a2" containerName="model-chainer-raw-1fc88" Apr 22 18:58:55.871056 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.871038 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:55.873329 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.873304 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:58:55.873477 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.873358 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-ac146-serving-cert\"" Apr 22 18:58:55.873477 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.873361 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-ac146-kube-rbac-proxy-sar-config\"" Apr 22 18:58:55.873598 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.873474 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-j46rb\"" Apr 22 18:58:55.880018 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.879993 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz"] Apr 22 18:58:55.991147 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.991116 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/030d3e5c-e704-4828-968f-b00ec4102640-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:55.991345 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:55.991155 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls\") pod 
\"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.092399 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.092366 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/030d3e5c-e704-4828-968f-b00ec4102640-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.092582 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.092406 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls\") pod \"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.092582 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:58:56.092517 2535 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-serving-cert: secret "model-chainer-raw-hpa-ac146-serving-cert" not found Apr 22 18:58:56.092582 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:58:56.092581 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls podName:030d3e5c-e704-4828-968f-b00ec4102640 nodeName:}" failed. No retries permitted until 2026-04-22 18:58:56.592563918 +0000 UTC m=+771.956363784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls") pod "model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" (UID: "030d3e5c-e704-4828-968f-b00ec4102640") : secret "model-chainer-raw-hpa-ac146-serving-cert" not found Apr 22 18:58:56.093025 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.093004 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/030d3e5c-e704-4828-968f-b00ec4102640-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.596299 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.596252 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls\") pod \"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.598735 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.598708 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls\") pod \"model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.782239 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.782209 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:56.902545 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:56.902509 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz"] Apr 22 18:58:56.905435 ip-10-0-136-85 kubenswrapper[2535]: W0422 18:58:56.905407 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod030d3e5c_e704_4828_968f_b00ec4102640.slice/crio-46c3b7ed55c947fcd6163643eff2c31acca8c3b6b4f14403542692b1b1083da4 WatchSource:0}: Error finding container 46c3b7ed55c947fcd6163643eff2c31acca8c3b6b4f14403542692b1b1083da4: Status 404 returned error can't find the container with id 46c3b7ed55c947fcd6163643eff2c31acca8c3b6b4f14403542692b1b1083da4 Apr 22 18:58:57.616960 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:57.616915 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" event={"ID":"030d3e5c-e704-4828-968f-b00ec4102640","Type":"ContainerStarted","Data":"f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983"} Apr 22 18:58:57.616960 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:57.616961 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" event={"ID":"030d3e5c-e704-4828-968f-b00ec4102640","Type":"ContainerStarted","Data":"46c3b7ed55c947fcd6163643eff2c31acca8c3b6b4f14403542692b1b1083da4"} Apr 22 18:58:57.617158 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:57.617045 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:58:57.632522 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:58:57.632475 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podStartSLOduration=2.632462816 podStartE2EDuration="2.632462816s" podCreationTimestamp="2026-04-22 18:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:58:57.630812539 +0000 UTC m=+772.994612427" watchObservedRunningTime="2026-04-22 18:58:57.632462816 +0000 UTC m=+772.996262705" Apr 22 18:59:03.629299 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:03.629262 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:59:05.909965 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:05.909932 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz"] Apr 22 18:59:05.910355 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:05.910185 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" containerID="cri-o://f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983" gracePeriod=30 Apr 22 18:59:08.627498 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:08.627463 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:13.626692 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:13.626650 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 22 18:59:18.626531 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:18.626491 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:18.626992 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:18.626610 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:59:23.626874 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:23.626829 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:28.627381 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:28.627292 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:33.627099 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:33.627050 2535 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:36.057803 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.057782 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:59:36.192876 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.192798 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/030d3e5c-e704-4828-968f-b00ec4102640-openshift-service-ca-bundle\") pod \"030d3e5c-e704-4828-968f-b00ec4102640\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " Apr 22 18:59:36.192876 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.192868 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls\") pod \"030d3e5c-e704-4828-968f-b00ec4102640\" (UID: \"030d3e5c-e704-4828-968f-b00ec4102640\") " Apr 22 18:59:36.193175 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.193152 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/030d3e5c-e704-4828-968f-b00ec4102640-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "030d3e5c-e704-4828-968f-b00ec4102640" (UID: "030d3e5c-e704-4828-968f-b00ec4102640"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:59:36.194970 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.194950 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "030d3e5c-e704-4828-968f-b00ec4102640" (UID: "030d3e5c-e704-4828-968f-b00ec4102640"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:59:36.294258 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.294227 2535 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/030d3e5c-e704-4828-968f-b00ec4102640-openshift-service-ca-bundle\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:59:36.294258 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.294251 2535 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/030d3e5c-e704-4828-968f-b00ec4102640-proxy-tls\") on node \"ip-10-0-136-85.ec2.internal\" DevicePath \"\"" Apr 22 18:59:36.733281 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.733241 2535 generic.go:358] "Generic (PLEG): container finished" podID="030d3e5c-e704-4828-968f-b00ec4102640" containerID="f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983" exitCode=0 Apr 22 18:59:36.733433 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.733297 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" event={"ID":"030d3e5c-e704-4828-968f-b00ec4102640","Type":"ContainerDied","Data":"f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983"} Apr 22 18:59:36.733433 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.733309 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" Apr 22 18:59:36.733433 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.733330 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz" event={"ID":"030d3e5c-e704-4828-968f-b00ec4102640","Type":"ContainerDied","Data":"46c3b7ed55c947fcd6163643eff2c31acca8c3b6b4f14403542692b1b1083da4"} Apr 22 18:59:36.733433 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.733349 2535 scope.go:117] "RemoveContainer" containerID="f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983" Apr 22 18:59:36.742017 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.742000 2535 scope.go:117] "RemoveContainer" containerID="f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983" Apr 22 18:59:36.742267 ip-10-0-136-85 kubenswrapper[2535]: E0422 18:59:36.742252 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983\": container with ID starting with f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983 not found: ID does not exist" containerID="f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983" Apr 22 18:59:36.742308 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.742274 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983"} err="failed to get container status \"f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983\": rpc error: code = NotFound desc = could not find container \"f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983\": container with ID starting with f1bf250adffc3e79e7bb5f8072e598602e128b2b2ff637161c56d88c3e69e983 not found: ID does not exist" Apr 22 18:59:36.753472 ip-10-0-136-85 kubenswrapper[2535]: 
I0422 18:59:36.753451 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz"] Apr 22 18:59:36.756789 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:36.756771 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-ac146-6f757f49f9-jgwrz"] Apr 22 18:59:37.258331 ip-10-0-136-85 kubenswrapper[2535]: I0422 18:59:37.258297 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030d3e5c-e704-4828-968f-b00ec4102640" path="/var/lib/kubelet/pods/030d3e5c-e704-4828-968f-b00ec4102640/volumes" Apr 22 19:08:18.126937 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:18.126896 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7njpk_8f788b1d-19db-4fcb-a814-0c4d333cdee1/global-pull-secret-syncer/0.log" Apr 22 19:08:18.298833 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:18.298802 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cmrnt_92f3b87a-b94c-43ff-b9f1-8f66016fc2ce/konnectivity-agent/0.log" Apr 22 19:08:18.395520 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:18.395441 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-85.ec2.internal_e8fe672e26d8f153f55920269ad886fc/haproxy/0.log" Apr 22 19:08:22.421092 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:22.421060 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qt4p8_1d364c8b-db8c-4766-a5ca-5c7ad02e0594/node-exporter/0.log" Apr 22 19:08:22.440659 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:22.440639 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qt4p8_1d364c8b-db8c-4766-a5ca-5c7ad02e0594/kube-rbac-proxy/0.log" Apr 22 19:08:22.461315 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:22.461293 2535 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-qt4p8_1d364c8b-db8c-4766-a5ca-5c7ad02e0594/init-textfile/0.log" Apr 22 19:08:22.699326 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:22.699253 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t2jsf_8db4bc65-ab32-491a-959b-d84aa9db30b1/prometheus-operator/0.log" Apr 22 19:08:22.719778 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:22.719756 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t2jsf_8db4bc65-ab32-491a-959b-d84aa9db30b1/kube-rbac-proxy/0.log" Apr 22 19:08:22.741344 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:22.741323 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-8vl7f_3e0d1edc-3f84-4d4b-b197-983f0a58e4e9/prometheus-operator-admission-webhook/0.log" Apr 22 19:08:24.854851 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:24.854781 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f5c6bdb9d-llxc5_90e22475-c5a3-49c7-a874-a8337b0db905/console/0.log" Apr 22 19:08:24.881759 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:24.881735 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xnrl5_04248ee6-0336-4fa6-bef3-9f8c9e646abb/download-server/0.log" Apr 22 19:08:25.137939 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.137827 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8"] Apr 22 19:08:25.138184 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.138170 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" Apr 22 19:08:25.138225 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.138186 2535 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" Apr 22 19:08:25.138266 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.138243 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="030d3e5c-e704-4828-968f-b00ec4102640" containerName="model-chainer-raw-hpa-ac146" Apr 22 19:08:25.139977 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.139961 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.142237 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.142216 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k99vz\"/\"kube-root-ca.crt\"" Apr 22 19:08:25.142364 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.142215 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k99vz\"/\"default-dockercfg-q7x9l\"" Apr 22 19:08:25.143016 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.143003 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k99vz\"/\"openshift-service-ca.crt\"" Apr 22 19:08:25.149335 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.149313 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8"] Apr 22 19:08:25.226611 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.226576 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-podres\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.226763 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.226618 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlq72\" (UniqueName: \"kubernetes.io/projected/5d377b26-8a53-44e6-b31f-143d33b13201-kube-api-access-vlq72\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.226763 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.226687 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-proc\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.226763 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.226740 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-lib-modules\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.226870 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.226775 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-sys\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327179 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327147 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-proc\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: 
\"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327179 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327185 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-lib-modules\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327414 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327215 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-sys\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327414 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327243 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-podres\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327414 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327266 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlq72\" (UniqueName: \"kubernetes.io/projected/5d377b26-8a53-44e6-b31f-143d33b13201-kube-api-access-vlq72\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327414 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327266 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-proc\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327414 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327318 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-sys\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327414 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327329 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-lib-modules\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.327622 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.327415 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d377b26-8a53-44e6-b31f-143d33b13201-podres\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.334295 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.334271 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlq72\" (UniqueName: \"kubernetes.io/projected/5d377b26-8a53-44e6-b31f-143d33b13201-kube-api-access-vlq72\") pod \"perf-node-gather-daemonset-hzmz8\" (UID: \"5d377b26-8a53-44e6-b31f-143d33b13201\") " pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.450019 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.449993 2535 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:25.567202 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.567179 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8"] Apr 22 19:08:25.569865 ip-10-0-136-85 kubenswrapper[2535]: W0422 19:08:25.569831 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5d377b26_8a53_44e6_b31f_143d33b13201.slice/crio-6fdacef2f6962bbce62a15a310c353a7c4d8e8a517a9546b1fe544eec28005fd WatchSource:0}: Error finding container 6fdacef2f6962bbce62a15a310c353a7c4d8e8a517a9546b1fe544eec28005fd: Status 404 returned error can't find the container with id 6fdacef2f6962bbce62a15a310c353a7c4d8e8a517a9546b1fe544eec28005fd Apr 22 19:08:25.571541 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.571522 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:08:25.936644 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.936618 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5xfcn_068518bd-bcf7-445a-954c-83b86af21011/dns/0.log" Apr 22 19:08:25.955266 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:25.955245 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5xfcn_068518bd-bcf7-445a-954c-83b86af21011/kube-rbac-proxy/0.log" Apr 22 19:08:26.059672 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.059642 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hgtq9_62c7a7b2-9390-4fb2-84e7-23f981d15afe/dns-node-resolver/0.log" Apr 22 19:08:26.354171 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.354081 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" 
event={"ID":"5d377b26-8a53-44e6-b31f-143d33b13201","Type":"ContainerStarted","Data":"d886907952a7dfe7b0363f04649282d61794b8c511abca558056d39ff5463e71"} Apr 22 19:08:26.354171 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.354117 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" event={"ID":"5d377b26-8a53-44e6-b31f-143d33b13201","Type":"ContainerStarted","Data":"6fdacef2f6962bbce62a15a310c353a7c4d8e8a517a9546b1fe544eec28005fd"} Apr 22 19:08:26.354390 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.354170 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:26.369645 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.369598 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" podStartSLOduration=1.369582443 podStartE2EDuration="1.369582443s" podCreationTimestamp="2026-04-22 19:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:26.368560305 +0000 UTC m=+1341.732360191" watchObservedRunningTime="2026-04-22 19:08:26.369582443 +0000 UTC m=+1341.733382335" Apr 22 19:08:26.483463 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.483435 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-58bf448f79-bgqh8_bf5c4059-62e9-440e-9901-5c1a13f99ab4/registry/0.log" Apr 22 19:08:26.501064 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:26.501044 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-642sx_476d713f-1df9-4369-9fcb-94a61680226e/node-ca/0.log" Apr 22 19:08:27.566766 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:27.566737 2535 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-xlf8v_ff946298-48b5-420f-ae4d-8f56d72aebaf/serve-healthcheck-canary/0.log" Apr 22 19:08:27.991714 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:27.991686 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6m5dg_e58478a2-3048-4cd7-9baa-4ab6d6f7f226/kube-rbac-proxy/0.log" Apr 22 19:08:28.011605 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:28.011581 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6m5dg_e58478a2-3048-4cd7-9baa-4ab6d6f7f226/exporter/0.log" Apr 22 19:08:28.030728 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:28.030707 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6m5dg_e58478a2-3048-4cd7-9baa-4ab6d6f7f226/extractor/0.log" Apr 22 19:08:29.938703 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:29.938672 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-xcjhs_7a0e563c-f2e0-49df-a8b6-91b9d8728e7a/manager/0.log" Apr 22 19:08:29.957051 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:29.957028 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-h4wrj_afa04f76-b491-46ff-b211-46822d1c0128/manager/0.log" Apr 22 19:08:29.976750 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:29.976730 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-nj94k_5383b6a7-aaf9-4d87-a3bf-a95b88093788/server/0.log" Apr 22 19:08:32.367564 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:32.367533 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-k99vz/perf-node-gather-daemonset-hzmz8" Apr 22 19:08:35.063321 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.063286 2535 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/kube-multus-additional-cni-plugins/0.log" Apr 22 19:08:35.079836 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.079814 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/egress-router-binary-copy/0.log" Apr 22 19:08:35.099474 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.099453 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/cni-plugins/0.log" Apr 22 19:08:35.121489 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.121469 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/bond-cni-plugin/0.log" Apr 22 19:08:35.141738 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.141713 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/routeoverride-cni/0.log" Apr 22 19:08:35.163620 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.163600 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/whereabouts-cni-bincopy/0.log" Apr 22 19:08:35.182616 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.182596 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-87xk8_ac730e3f-49e1-4703-9a89-4d82e11d265d/whereabouts-cni/0.log" Apr 22 19:08:35.548235 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.548205 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pjv5t_2251f062-5650-40fb-b187-729124eb8087/kube-multus/0.log" Apr 22 19:08:35.615512 ip-10-0-136-85 
kubenswrapper[2535]: I0422 19:08:35.615485 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5g7dk_252dfd14-9c83-4928-bbcd-d84b479525bc/network-metrics-daemon/0.log" Apr 22 19:08:35.636324 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:35.636289 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5g7dk_252dfd14-9c83-4928-bbcd-d84b479525bc/kube-rbac-proxy/0.log" Apr 22 19:08:36.446879 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.446848 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/ovn-controller/0.log" Apr 22 19:08:36.471835 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.471775 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/ovn-acl-logging/0.log" Apr 22 19:08:36.488323 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.488298 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/kube-rbac-proxy-node/0.log" Apr 22 19:08:36.510994 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.510969 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:08:36.528705 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.528688 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/northd/0.log" Apr 22 19:08:36.546831 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.546812 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/nbdb/0.log" Apr 22 19:08:36.565512 ip-10-0-136-85 
kubenswrapper[2535]: I0422 19:08:36.565496 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/sbdb/0.log" Apr 22 19:08:36.651939 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:36.651891 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-b8wsf_88bb7a51-9742-46e6-817e-2c17c4357d07/ovnkube-controller/0.log" Apr 22 19:08:38.243406 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:38.243374 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-89stm_85ff2eb7-3fb1-424b-9402-d67103c35bf2/network-check-target-container/0.log" Apr 22 19:08:39.146144 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:39.146118 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hkwlr_d21b0e10-276b-4b90-8d4c-23aa447f6f29/iptables-alerter/0.log" Apr 22 19:08:39.695293 ip-10-0-136-85 kubenswrapper[2535]: I0422 19:08:39.695264 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5rsfp_6aedca06-2e88-42cc-a622-3d71dec7b063/tuned/0.log"