Apr 22 17:53:57.100032 ip-10-0-130-112 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:57.513798 ip-10-0-130-112 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:57.513798 ip-10-0-130-112 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:57.513798 ip-10-0-130-112 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:57.513798 ip-10-0-130-112 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:57.513798 ip-10-0-130-112 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:57.516081 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.515997    2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:57.520122 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520107    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:57.520122 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520123    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520128    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520131    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520134    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520137    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520140    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520143    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520146    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520148    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520151    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520154    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520157    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520160    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520162    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520165    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520167    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520171    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520174    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520177    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520179    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:57.520186 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520182    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520184    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520187    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520190    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520192    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520195    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520198    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520200    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520203    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520206    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520209    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520211    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520213    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520216    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520220    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520222    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520225    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520227    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520230    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520233    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:57.520653 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520235    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520237    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520240    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520242    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520245    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520247    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520249    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520252    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520254    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520256    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520259    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520261    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520264    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520268    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520272    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520276    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520278    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520282    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520284    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:57.521171 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520287    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520289    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520291    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520295    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520299    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520303    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520306    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520309    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520312    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520314    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520317    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520319    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520322    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520324    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520327    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520329    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520332    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520335    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520337    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:57.521625 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520344    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520347    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520349    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520351    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520354    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520356    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520359    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520768    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520775    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520778    2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520782    2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520786    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520789    2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520792    2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520795    2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520797    2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520800    2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520803    2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520805    2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:57.522132 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520808    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520810    2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520813    2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520815    2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520818    2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520820    2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520823    2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520826    2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520829    2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520831    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520833    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520836    2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520839    2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520842    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520845    2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520847    2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520850    2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520852    2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520855    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520857    2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:57.522580 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520860    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520863    2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520865    2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520867    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520870    2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520872    2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520875    2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520877    2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520879    2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520882    2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520884    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520886    2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520889    2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520891    2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520894    2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520896    2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520898    2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520901    2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520903    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520906    2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:57.523121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520910    2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520914    2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520916    2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520919    2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520922    2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520926    2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520929    2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520931    2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520934    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520936    2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520939    2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520941    2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520944    2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520947    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520950    2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520952    2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520955    2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520958    2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520960    2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520963    2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:57.523628 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520965    2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520968    2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520970    2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520972    2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520975    2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520977    2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520979    2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520982    2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520984    2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520986    2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520989    2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520991    2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520994    2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.520996    2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522072    2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522082    2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522088    2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522093    2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522102    2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522105    2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522110    2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:57.524134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522115    2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522118    2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522121    2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522125    2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522129    2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522132    2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522135    2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522139    2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522142    2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522145    2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522148    2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522150    2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522155    2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522158    2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522161    2578 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522164    2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522167    2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522172    2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522174    2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522177    2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522181    2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522183    2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522186    2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522189    2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522193    2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:57.524643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522197    2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522201    2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522204    2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522207    2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522210    2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522213    2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522215    2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522220    2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522223    2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522226    2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522229    2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522232    2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522236    2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522239    2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522242    2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522245    2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522248    2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522251    2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522254    2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522257    2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522260    2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522263    2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522266    2578 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522270    2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522273    2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 17:53:57.525268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522276    2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522280    2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522283    2578
flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522287 2578 flags.go:64] FLAG: --help="false" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522289 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522293 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522296 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522303 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522307 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522310 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522313 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522316 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522319 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522321 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522324 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522327 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:57.525874 
ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522330 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522333 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522336 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522339 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522341 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522344 2578 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522347 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522349 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:57.525874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522353 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522358 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522361 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522364 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522367 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522370 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522373 2578 flags.go:64] 
FLAG: --make-iptables-util-chains="true" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522376 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522379 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522383 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522386 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522390 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522393 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522396 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522398 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522403 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522406 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522409 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522412 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522420 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522423 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 
17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522427 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522430 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:57.526450 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522432 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522438 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522441 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522444 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522447 2578 flags.go:64] FLAG: --port="10250" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522450 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522453 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-090146a1df51da503" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522456 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522459 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522462 2578 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522465 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522469 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:57.527018 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:57.522473 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522475 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522478 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522481 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522484 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522487 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522490 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522493 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522496 2578 flags.go:64] FLAG: --runonce="false" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522499 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522502 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522505 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522508 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522512 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:57.527018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522515 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 
17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522518 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522521 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522524 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522527 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522530 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522533 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522536 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522539 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522542 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522547 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522550 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522553 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522558 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522561 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522564 2578 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522567 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522571 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522574 2578 flags.go:64] FLAG: --v="2" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522578 2578 flags.go:64] FLAG: --version="false" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522582 2578 flags.go:64] FLAG: --vmodule="" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522586 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.522590 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522685 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:57.527634 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522689 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522692 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522695 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522698 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522700 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:57.528222 ip-10-0-130-112 
kubenswrapper[2578]: W0422 17:53:57.522703 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522705 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522708 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522711 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522714 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522717 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522719 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522722 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522724 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522727 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522730 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522732 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522736 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522740 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522742 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:57.528222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522745 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522748 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522762 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522765 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522768 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522771 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522774 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522776 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522779 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522781 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:57.528745 ip-10-0-130-112 
kubenswrapper[2578]: W0422 17:53:57.522783 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522786 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522788 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522791 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522793 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522795 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522798 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522801 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522803 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522806 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:57.528745 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522808 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522811 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522814 2578 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522817 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522819 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522822 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522825 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522827 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522830 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522832 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522835 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522837 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522840 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522842 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522845 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522847 2578 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522850 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522852 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522855 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522858 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:57.529282 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522860 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522863 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522865 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522868 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522870 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522873 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522875 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522878 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 
17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522882 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522885 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522888 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522890 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522893 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522896 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522898 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522901 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522903 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522905 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522908 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:57.530233 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522910 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 
17:53:57.530800 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522913 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:57.530800 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522915 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:57.530800 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522917 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:57.530800 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522920 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:57.530800 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.522922 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:57.530800 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.523482 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:57.530969 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.530896 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:53:57.530969 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.530912 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:53:57.530969 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530961 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:57.530969 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530967 2578 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530972 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530975 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530979 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530981 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530984 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530987 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530991 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530996 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.530999 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531001 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531004 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531007 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531009 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531012 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531014 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531017 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531020 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531023 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531025 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:57.531073 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531028 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531031 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531034 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531037 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531039 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531042 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531044 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531047 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531050 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531052 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531055 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531057 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531060 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531063 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531066 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531068 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531071 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531074 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531077 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531080 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:57.531572 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531082 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531085 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531091 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531095 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531097 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531100 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531102 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531105 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531107 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531110 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531112 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531115 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531117 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531121 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531126 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531129 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531133 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531136 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531139 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:57.532090 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531141 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531153 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531157 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531160 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531163 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531166 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531168 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531171 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531173 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531176 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531178 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531181 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531183 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531186 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531188 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531191 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531194 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531197 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531199 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531202 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:57.532578 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531204 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531207 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531209 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531212 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531214 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531217 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.531222 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531319 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531324 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531327 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531330 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531334 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531338 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531342 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531345 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:57.533088 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531348 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531350 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531353 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531356 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531358 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531361 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531363 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531366 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531368 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531371 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531373 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531376 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531378 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531380 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531384 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531387 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531389 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531392 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531394 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:57.533464 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531396 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531399 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531401 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531404 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531406 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531409 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531411 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531414 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531416 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531418 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531420 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531423 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531426 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531429 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531431 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531433 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531436 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531438 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531440 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:57.533937 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531444 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531447 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531450 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531452 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531455 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531457 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531459 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531462 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531464 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531467 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531470 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531472 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531475 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531477 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531480 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531482 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531485 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531487 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531490 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531492 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:57.534390 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531495 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531497 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531499 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531502 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531504 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531507 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531510 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531512 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531514 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531517 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531520 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531522 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531524 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531527 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531529 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531531 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531534 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531536 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531538 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:57.534883 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:57.531541 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:57.535355 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.531545 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:53:57.535355 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.532233 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:53:57.535984 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.535970 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:53:57.536915 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.536904 2578 server.go:1019] "Starting client certificate rotation"
Apr 22 17:53:57.537025 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.537002 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:57.537057 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.537053 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:53:57.560653 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.560633 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:57.565355 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.565338 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:53:57.579009 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.578984 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:53:57.585192 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.585176 2578 log.go:25] "Validated CRI v1 image API"
Apr 22 17:53:57.586449 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.586423 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:53:57.591179 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.591119 2578 fs.go:135] Filesystem UUIDs: map[03e0a75d-a429-40ac-a966-412ceb935b3f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c59dde5f-e98b-41bc-8ee5-b77487d4ebf5:/dev/nvme0n1p3]
Apr 22 17:53:57.591296 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.591179 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:53:57.595448 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.595426 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:57.597748 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.597634 2578 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:57.59590037 +0000 UTC m=+0.384490124 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100389 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2754f3a44d6f987c016975b0218bbe SystemUUID:ec2754f3-a44d-6f98-7c01-6975b0218bbe BootID:d60229a6-a429-4fab-ad74-b801bbef182a Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a3:b7:d7:ee:83 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a3:b7:d7:ee:83 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:b8:1b:6a:8b:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:53:57.597748 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.597736 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:53:57.597923 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.597867 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:53:57.600352 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.600325 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:53:57.600519 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.600355 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-112.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:53:57.600600 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.600533 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:53:57.600600 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.600546 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:53:57.600600 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.600565
2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:53:57.601254 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.601241 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:53:57.601975 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.601962 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:57.602101 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.602091 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 17:53:57.604349 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.604337 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 22 17:53:57.604414 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.604354 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 17:53:57.604414 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.604371 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 17:53:57.604414 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.604384 2578 kubelet.go:397] "Adding apiserver pod source" Apr 22 17:53:57.604414 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.604396 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 17:53:57.605298 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.605286 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:57.605371 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.605308 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:57.608136 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.608121 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:53:57.611479 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:57.611459 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pb9v6" Apr 22 17:53:57.619861 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.619844 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pb9v6" Apr 22 17:53:57.620286 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.620271 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:53:57.621661 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621647 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:53:57.621703 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621672 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:53:57.621703 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621679 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:53:57.621703 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621687 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:53:57.621703 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621692 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:53:57.621703 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621698 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:53:57.621703 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621703 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:53:57.621874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621709 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:53:57.621874 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:57.621716 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:53:57.621874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621722 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:53:57.621874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621738 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:53:57.621874 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.621747 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:53:57.623661 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.623648 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:53:57.623701 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.623664 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:53:57.628574 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.628559 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:53:57.628641 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.628599 2578 server.go:1295] "Started kubelet" Apr 22 17:53:57.628729 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.628706 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:53:57.628856 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.628819 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:53:57.628893 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.628880 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:53:57.629523 ip-10-0-130-112 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 17:53:57.630061 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.629978 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:53:57.630190 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.630182 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:57.631980 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.631958 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:57.632177 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.632155 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:53:57.635074 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.635055 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-112.ec2.internal" not found Apr 22 17:53:57.636772 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.636734 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:53:57.637058 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637038 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:53:57.637058 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637047 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:57.637669 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637655 2578 factory.go:55] Registering systemd factory Apr 22 17:53:57.637739 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637673 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:53:57.637739 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637678 2578 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:53:57.637739 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637681 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:53:57.637739 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637703 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637795 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637804 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.637843 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-112.ec2.internal\" not found" Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637960 2578 factory.go:153] Registering CRI-O factory Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.637971 2578 factory.go:223] Registration of the crio container factory successfully Apr 22 
17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.638014 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.638034 2578 factory.go:103] Registering Raw factory Apr 22 17:53:57.638114 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.638046 2578 manager.go:1196] Started watching for new ooms in manager Apr 22 17:53:57.639104 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.639085 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:57.639295 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.639277 2578 manager.go:319] Starting recovery of all containers Apr 22 17:53:57.643733 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.643713 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-112.ec2.internal\" not found" node="ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.649157 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.649038 2578 manager.go:324] Recovery completed Apr 22 17:53:57.650663 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.650644 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-112.ec2.internal" not found Apr 22 17:53:57.653349 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.653336 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:57.655227 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.655213 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:57.655283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.655244 2578 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:57.655283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.655257 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:57.655794 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.655780 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:53:57.655851 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.655793 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:53:57.655851 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.655813 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:57.658099 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.658084 2578 policy_none.go:49] "None policy: Start" Apr 22 17:53:57.658099 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.658101 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:53:57.658204 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.658112 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.693353 2578 manager.go:341] "Starting Device Plugin manager" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.693385 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.693394 2578 server.go:85] "Starting device plugin registration server" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.693649 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.693665 2578 container_log_manager.go:189] "Initializing container log rotate 
workers" workers=1 monitorPeriod="10s" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.694255 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.694349 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.694358 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.694412 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:53:57.697073 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.694448 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-112.ec2.internal\" not found" Apr 22 17:53:57.708906 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.708888 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-112.ec2.internal" not found Apr 22 17:53:57.760015 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.759976 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:53:57.761444 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.761428 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:53:57.761496 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.761465 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:53:57.761496 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.761484 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 17:53:57.761496 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.761494 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:53:57.761593 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.761533 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:53:57.765168 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.764547 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:57.793851 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.793819 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:57.794844 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.794829 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:57.794925 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.794860 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:57.794925 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.794871 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:57.794925 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.794895 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.804155 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.804132 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.804276 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:57.804164 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-112.ec2.internal\": node \"ip-10-0-130-112.ec2.internal\" not found" Apr 22 
17:53:57.862089 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.862038 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal"] Apr 22 17:53:57.864190 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.864162 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.864190 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.864183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.895397 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.895375 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.899742 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.899728 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:57.906816 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.906801 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:53:57.910121 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:57.910108 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:53:58.038877 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.038802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/900252a3f62a7cd1f973133b1b7ebceb-etc-kube\") 
pod \"kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal\" (UID: \"900252a3f62a7cd1f973133b1b7ebceb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.038877 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.038830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/900252a3f62a7cd1f973133b1b7ebceb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal\" (UID: \"900252a3f62a7cd1f973133b1b7ebceb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.038877 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.038858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8a719b01c36f6eedbdfdd5602a1c99a-config\") pod \"kube-apiserver-proxy-ip-10-0-130-112.ec2.internal\" (UID: \"e8a719b01c36f6eedbdfdd5602a1c99a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.139719 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.139690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/900252a3f62a7cd1f973133b1b7ebceb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal\" (UID: \"900252a3f62a7cd1f973133b1b7ebceb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.139860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.139727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/900252a3f62a7cd1f973133b1b7ebceb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal\" (UID: \"900252a3f62a7cd1f973133b1b7ebceb\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.139860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.139745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8a719b01c36f6eedbdfdd5602a1c99a-config\") pod \"kube-apiserver-proxy-ip-10-0-130-112.ec2.internal\" (UID: \"e8a719b01c36f6eedbdfdd5602a1c99a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.139860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.139809 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e8a719b01c36f6eedbdfdd5602a1c99a-config\") pod \"kube-apiserver-proxy-ip-10-0-130-112.ec2.internal\" (UID: \"e8a719b01c36f6eedbdfdd5602a1c99a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.139860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.139814 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/900252a3f62a7cd1f973133b1b7ebceb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal\" (UID: \"900252a3f62a7cd1f973133b1b7ebceb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.139860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.139814 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/900252a3f62a7cd1f973133b1b7ebceb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal\" (UID: \"900252a3f62a7cd1f973133b1b7ebceb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.208905 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.208872 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.212415 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.212394 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" Apr 22 17:53:58.536362 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.536338 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 17:53:58.536997 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.536493 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:58.536997 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.536499 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:58.536997 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.536542 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:58.604772 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.604721 2578 apiserver.go:52] "Watching apiserver" Apr 22 17:53:58.614993 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.614961 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 17:53:58.615297 ip-10-0-130-112 kubenswrapper[2578]: 
I0422 17:53:58.615271 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-rcvv7","kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t","openshift-cluster-node-tuning-operator/tuned-xvsrm","openshift-image-registry/node-ca-w5cfx","openshift-multus/multus-additional-cni-plugins-f496z","openshift-multus/multus-dgpsk","openshift-network-operator/iptables-alerter-rzm27","openshift-dns/node-resolver-6d4jf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal","openshift-multus/network-metrics-daemon-4jgqt","openshift-network-diagnostics/network-check-target-bnklt","openshift-ovn-kubernetes/ovnkube-node-6jx4b"] Apr 22 17:53:58.618122 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.618104 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.620631 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.620612 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:53:58.620900 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.620881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 17:53:58.620957 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.620902 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-p7w9b\"" Apr 22 17:53:58.621451 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.621431 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.621714 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.621638 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:57 +0000 UTC" deadline="2027-10-22 07:12:46.98994908 +0000 UTC" Apr 22 17:53:58.621714 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.621669 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13141h18m48.368285175s" Apr 22 17:53:58.623603 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.623583 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.623994 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.623975 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.623994 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.623983 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:53:58.624122 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.624006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-l5nc5\"" Apr 22 17:53:58.624122 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.624047 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.625621 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.625602 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.625940 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.625922 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.626527 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.626514 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.626622 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.626608 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nqhxk\"" Apr 22 17:53:58.627997 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.627982 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:53:58.628643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.628046 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.628741 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.628727 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.628822 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.628767 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.630365 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.628889 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btfrc\"" Apr 22 17:53:58.631585 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.631569 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:53:58.632146 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.631736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wdwr4\"" Apr 22 17:53:58.632503 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.632486 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:53:58.632582 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.632507 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.632582 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.632550 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:53:58.632671 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.632582 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.633472 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.633452 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.635780 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.635747 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.635978 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.635852 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-scn75\"" Apr 22 17:53:58.636137 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.635866 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:53:58.637888 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.637869 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:58.638500 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.638236 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.640388 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.640366 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.640483 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.640421 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kkmxm\"" Apr 22 17:53:58.640671 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.640655 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:53:58.642172 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642150 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-socket-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 
22 17:53:58.642231 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-system-cni-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642279 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642244 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-socket-dir-parent\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642322 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642301 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-lib-modules\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.642371 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642354 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsd4c\" (UniqueName: \"kubernetes.io/projected/77641058-c12b-451a-a008-3394e1d05e7f-kube-api-access-nsd4c\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.642411 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: 
\"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.642451 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-netns\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642451 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642431 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/60af7321-9798-4032-9193-93cd4606e87a-multus-daemon-config\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642526 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642452 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.642526 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-hostroot\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642526 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-multus-certs\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642651 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642537 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-system-cni-dir\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.642651 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg75m\" (UniqueName: \"kubernetes.io/projected/d8999947-ef96-4a70-8257-7e319d5967db-kube-api-access-cg75m\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.642651 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642590 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htx95\" (UniqueName: \"kubernetes.io/projected/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-kube-api-access-htx95\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.642651 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60af7321-9798-4032-9193-93cd4606e87a-cni-binary-copy\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642651 ip-10-0-130-112 kubenswrapper[2578]: 
I0422 17:53:58.642637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a82d37-d50f-4213-8068-442b3abeb6b3-host-slash\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642639 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cec375e4-f4c2-456e-ae9c-d41aec34f6d1-agent-certs\") pod \"konnectivity-agent-rcvv7\" (UID: \"cec375e4-f4c2-456e-ae9c-d41aec34f6d1\") " pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-modprobe-d\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-registration-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642729 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-kubelet\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642784 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-cnibin\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.642734 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642807 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-cni-bin\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh4lr\" (UniqueName: \"kubernetes.io/projected/60af7321-9798-4032-9193-93cd4606e87a-kube-api-access-jh4lr\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-systemd\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.642859 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-cnibin\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.642897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f2a82d37-d50f-4213-8068-442b3abeb6b3-iptables-alerter-script\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642920 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cec375e4-f4c2-456e-ae9c-d41aec34f6d1-konnectivity-ca\") pod \"konnectivity-agent-rcvv7\" (UID: \"cec375e4-f4c2-456e-ae9c-d41aec34f6d1\") " pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-var-lib-kubelet\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.642987 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-cni-binary-copy\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-os-release\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643052 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643066 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-device-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2cspk\"" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-kubernetes\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-os-release\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643126 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643153 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-sys\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77641058-c12b-451a-a008-3394e1d05e7f-host\") pod 
\"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643244 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkq2w\" (UniqueName: \"kubernetes.io/projected/0ebf5cfe-98a9-4a57-9613-e936af7bade0-kube-api-access-mkq2w\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643270 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-cni-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.643482 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643288 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mgd9\" (UniqueName: \"kubernetes.io/projected/f2a82d37-d50f-4213-8068-442b3abeb6b3-kube-api-access-9mgd9\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643302 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77641058-c12b-451a-a008-3394e1d05e7f-serviceca\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643337 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-sys-fs\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-etc-kubernetes\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-tuned\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643468 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-cni-multus\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.644197 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.643493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysctl-d\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643517 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-tmp\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwkfx\" (UniqueName: \"kubernetes.io/projected/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-kube-api-access-lwkfx\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643610 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-hosts-file\") pod \"node-resolver-6d4jf\" (UID: 
\"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643663 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-conf-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-k8s-cni-cncf-io\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysconfig\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysctl-conf\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-run\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643809 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-host\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.644197 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.643832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-tmp-dir\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.645235 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.645217 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.647574 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.647555 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-twsbx\"" Apr 22 17:53:58.647860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.647825 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:53:58.647860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.647840 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:53:58.647860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.647850 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:53:58.647860 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.647859 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:53:58.648924 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.648901 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:53:58.648980 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.648922 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:53:58.663311 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.663288 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:53:58.685846 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.685821 2578 csr.go:274] "Certificate signing request is 
approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-84znr" Apr 22 17:53:58.710789 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.710748 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-84znr" Apr 22 17:53:58.738920 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.738902 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:53:58.744468 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744438 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.744590 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-ovnkube-script-lib\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.744590 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-k8s-cni-cncf-io\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.744590 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysconfig\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744590 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744574 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysctl-conf\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744595 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-k8s-cni-cncf-io\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-run\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysconfig\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-run\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-host\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-tmp-dir\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-host\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysctl-conf\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9bg\" (UniqueName: 
\"kubernetes.io/projected/8beb4b83-ece9-44df-bc80-fea79bf050d5-kube-api-access-4z9bg\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:58.744771 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-systemd-units\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744793 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-socket-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-system-cni-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-socket-dir-parent\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744838 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-lib-modules\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsd4c\" (UniqueName: \"kubernetes.io/projected/77641058-c12b-451a-a008-3394e1d05e7f-kube-api-access-nsd4c\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744871 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-cni-bin\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-system-cni-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744894 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-socket-dir-parent\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744899 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-socket-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-netns\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-netns\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/60af7321-9798-4032-9193-93cd4606e87a-multus-daemon-config\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744980 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-slash\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.744984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-lib-modules\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-var-lib-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.745185 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-etc-selinux\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.745939 
ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745060 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-hostroot\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-multus-certs\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-system-cni-dir\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cg75m\" (UniqueName: \"kubernetes.io/projected/d8999947-ef96-4a70-8257-7e319d5967db-kube-api-access-cg75m\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745183 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-hostroot\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 
17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htx95\" (UniqueName: \"kubernetes.io/projected/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-kube-api-access-htx95\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60af7321-9798-4032-9193-93cd4606e87a-cni-binary-copy\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-run-multus-certs\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a82d37-d50f-4213-8068-442b3abeb6b3-host-slash\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-system-cni-dir\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " 
pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-tmp-dir\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cec375e4-f4c2-456e-ae9c-d41aec34f6d1-agent-certs\") pod \"konnectivity-agent-rcvv7\" (UID: \"cec375e4-f4c2-456e-ae9c-d41aec34f6d1\") " pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f2a82d37-d50f-4213-8068-442b3abeb6b3-host-slash\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-modprobe-d\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.745939 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745521 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-registration-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-kubelet\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745581 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-cnibin\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/60af7321-9798-4032-9193-93cd4606e87a-multus-daemon-config\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-cni-bin\") pod \"multus-dgpsk\" 
(UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh4lr\" (UniqueName: \"kubernetes.io/projected/60af7321-9798-4032-9193-93cd4606e87a-kube-api-access-jh4lr\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-modprobe-d\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-systemd\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-cnibin\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-kubelet\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " 
pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-cni-bin\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-cnibin\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745724 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-cnibin\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745721 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-registration-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-systemd\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745818 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f2a82d37-d50f-4213-8068-442b3abeb6b3-iptables-alerter-script\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/60af7321-9798-4032-9193-93cd4606e87a-cni-binary-copy\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.746618 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/cec375e4-f4c2-456e-ae9c-d41aec34f6d1-konnectivity-ca\") pod \"konnectivity-agent-rcvv7\" (UID: \"cec375e4-f4c2-456e-ae9c-d41aec34f6d1\") " pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-var-lib-kubelet\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-cni-binary-copy\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.745955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-run-netns\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:53:58.745994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-cni-netd\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-os-release\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-device-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-kubernetes\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746099 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746153 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90656894-8740-4f11-8fc1-932678cddd3c-ovn-node-metrics-cert\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-os-release\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-sys\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77641058-c12b-451a-a008-3394e1d05e7f-host\") pod \"node-ca-w5cfx\" (UID: 
\"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746277 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-env-overrides\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.747613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746304 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f2a82d37-d50f-4213-8068-442b3abeb6b3-iptables-alerter-script\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f92ls\" (UniqueName: \"kubernetes.io/projected/90656894-8740-4f11-8fc1-932678cddd3c-kube-api-access-f92ls\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkq2w\" (UniqueName: \"kubernetes.io/projected/0ebf5cfe-98a9-4a57-9613-e936af7bade0-kube-api-access-mkq2w\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cec375e4-f4c2-456e-ae9c-d41aec34f6d1-konnectivity-ca\") pod \"konnectivity-agent-rcvv7\" (UID: \"cec375e4-f4c2-456e-ae9c-d41aec34f6d1\") " pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-cni-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-var-lib-kubelet\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746424 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9mgd9\" (UniqueName: \"kubernetes.io/projected/f2a82d37-d50f-4213-8068-442b3abeb6b3-kube-api-access-9mgd9\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746457 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-ovnkube-config\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-os-release\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77641058-c12b-451a-a008-3394e1d05e7f-serviceca\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-ovn\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: 
I0422 17:53:58.746572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-device-dir\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-log-socket\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-sys-fs\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-kubernetes\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.748353 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-etc-kubernetes\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.748353 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.746670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-tuned\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-sys\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.746946 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0ebf5cfe-98a9-4a57-9613-e936af7bade0-sys-fs\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-etc-kubernetes\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749112 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.747046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77641058-c12b-451a-a008-3394e1d05e7f-host\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-kubelet\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-systemd\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-cni-multus\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysctl-d\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749112 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.747219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-tmp\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-os-release\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-cni-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-sysctl-d\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747444 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-host-var-lib-cni-multus\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwkfx\" (UniqueName: \"kubernetes.io/projected/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-kube-api-access-lwkfx\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.749112 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-hosts-file\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" 
Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-conf-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-etc-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747683 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-node-log\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.747842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d8999947-ef96-4a70-8257-7e319d5967db-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.748011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-hosts-file\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " 
pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.748061 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/60af7321-9798-4032-9193-93cd4606e87a-multus-conf-dir\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.748413 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77641058-c12b-451a-a008-3394e1d05e7f-serviceca\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.748525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d8999947-ef96-4a70-8257-7e319d5967db-cni-binary-copy\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.749566 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-etc-tuned\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.749677 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-tmp\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.749951 
ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.749842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cec375e4-f4c2-456e-ae9c-d41aec34f6d1-agent-certs\") pod \"konnectivity-agent-rcvv7\" (UID: \"cec375e4-f4c2-456e-ae9c-d41aec34f6d1\") " pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.754331 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.754311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsd4c\" (UniqueName: \"kubernetes.io/projected/77641058-c12b-451a-a008-3394e1d05e7f-kube-api-access-nsd4c\") pod \"node-ca-w5cfx\" (UID: \"77641058-c12b-451a-a008-3394e1d05e7f\") " pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:58.756383 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.756356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh4lr\" (UniqueName: \"kubernetes.io/projected/60af7321-9798-4032-9193-93cd4606e87a-kube-api-access-jh4lr\") pod \"multus-dgpsk\" (UID: \"60af7321-9798-4032-9193-93cd4606e87a\") " pod="openshift-multus/multus-dgpsk" Apr 22 17:53:58.757073 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.757051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mgd9\" (UniqueName: \"kubernetes.io/projected/f2a82d37-d50f-4213-8068-442b3abeb6b3-kube-api-access-9mgd9\") pod \"iptables-alerter-rzm27\" (UID: \"f2a82d37-d50f-4213-8068-442b3abeb6b3\") " pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:58.757484 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.757461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg75m\" (UniqueName: \"kubernetes.io/projected/d8999947-ef96-4a70-8257-7e319d5967db-kube-api-access-cg75m\") pod \"multus-additional-cni-plugins-f496z\" (UID: \"d8999947-ef96-4a70-8257-7e319d5967db\") " 
pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:58.757877 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.757857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwkfx\" (UniqueName: \"kubernetes.io/projected/ef7bdbb8-6d5c-428b-9aa0-57c42271876a-kube-api-access-lwkfx\") pod \"tuned-xvsrm\" (UID: \"ef7bdbb8-6d5c-428b-9aa0-57c42271876a\") " pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.758138 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.758110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htx95\" (UniqueName: \"kubernetes.io/projected/5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb-kube-api-access-htx95\") pod \"node-resolver-6d4jf\" (UID: \"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb\") " pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.758228 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.758208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkq2w\" (UniqueName: \"kubernetes.io/projected/0ebf5cfe-98a9-4a57-9613-e936af7bade0-kube-api-access-mkq2w\") pod \"aws-ebs-csi-driver-node-c2p8t\" (UID: \"0ebf5cfe-98a9-4a57-9613-e936af7bade0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.760172 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:58.760143 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a719b01c36f6eedbdfdd5602a1c99a.slice/crio-803c1742360153eaf1078e9c3ef792957ecc814979a686fdf02bfd71b4951ebc WatchSource:0}: Error finding container 803c1742360153eaf1078e9c3ef792957ecc814979a686fdf02bfd71b4951ebc: Status 404 returned error can't find the container with id 803c1742360153eaf1078e9c3ef792957ecc814979a686fdf02bfd71b4951ebc Apr 22 17:53:58.760470 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:58.760457 2578 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod900252a3f62a7cd1f973133b1b7ebceb.slice/crio-a07d1a86cf22b700f2ec65e2ea8f76fed57ee0b3bd76ab74be6abb9b17ebd70a WatchSource:0}: Error finding container a07d1a86cf22b700f2ec65e2ea8f76fed57ee0b3bd76ab74be6abb9b17ebd70a: Status 404 returned error can't find the container with id a07d1a86cf22b700f2ec65e2ea8f76fed57ee0b3bd76ab74be6abb9b17ebd70a Apr 22 17:53:58.761320 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.761306 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6d4jf" Apr 22 17:53:58.765720 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.765700 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:53:58.769361 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:58.769338 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bfed2dc_0dd0_4e54_bfc5_dba8108bcdeb.slice/crio-00d128e755564eaa30008f607f291b60b8e8e291ea167c8335e7fed6f1a69f9b WatchSource:0}: Error finding container 00d128e755564eaa30008f607f291b60b8e8e291ea167c8335e7fed6f1a69f9b: Status 404 returned error can't find the container with id 00d128e755564eaa30008f607f291b60b8e8e291ea167c8335e7fed6f1a69f9b Apr 22 17:53:58.848871 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90656894-8740-4f11-8fc1-932678cddd3c-ovn-node-metrics-cert\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-env-overrides\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f92ls\" (UniqueName: \"kubernetes.io/projected/90656894-8740-4f11-8fc1-932678cddd3c-kube-api-access-f92ls\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-ovnkube-config\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848969 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.848981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-ovn\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849004 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-log-socket\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.849028 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.849101 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:59.349081346 +0000 UTC m=+2.137671114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-kubelet\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-systemd\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849173 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-etc-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:53:58.849282 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-ovn\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-node-log\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-systemd\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-etc-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849367 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-kubelet\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849360 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-ovnkube-script-lib\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-log-socket\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849406 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9bg\" (UniqueName: \"kubernetes.io/projected/8beb4b83-ece9-44df-bc80-fea79bf050d5-kube-api-access-4z9bg\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:58.849987 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.849432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-node-log\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849449 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-systemd-units\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-cni-bin\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-slash\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.849987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.849924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-var-lib-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850238 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.850014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-run-netns\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850238 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-cni-netd\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850238 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-var-lib-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850238 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-cni-bin\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850238 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-env-overrides\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850238 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.850137 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-slash\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850494 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-ovnkube-script-lib\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850494 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90656894-8740-4f11-8fc1-932678cddd3c-ovnkube-config\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850581 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850541 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-cni-netd\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850628 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.850725 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:58.850703 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-host-run-netns\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.852721 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-run-openvswitch\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.852721 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.850786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/90656894-8740-4f11-8fc1-932678cddd3c-systemd-units\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.852721 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.851658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90656894-8740-4f11-8fc1-932678cddd3c-ovn-node-metrics-cert\") pod \"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.857006 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.856988 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:58.857006 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.857008 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:58.857113 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.857018 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:58.857113 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:58.857076 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:59.357062617 +0000 UTC m=+2.145652389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:58.862719 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.862696 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9bg\" (UniqueName: \"kubernetes.io/projected/8beb4b83-ece9-44df-bc80-fea79bf050d5-kube-api-access-4z9bg\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:58.863037 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.863021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92ls\" (UniqueName: \"kubernetes.io/projected/90656894-8740-4f11-8fc1-932678cddd3c-kube-api-access-f92ls\") pod 
\"ovnkube-node-6jx4b\" (UID: \"90656894-8740-4f11-8fc1-932678cddd3c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:58.949164 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.949130 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:53:58.956148 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:58.956121 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec375e4_f4c2_456e_ae9c_d41aec34f6d1.slice/crio-77a492ab263b3f69231e3868ad7b77f55a49b2d8c13af66d27f37241f6438a81 WatchSource:0}: Error finding container 77a492ab263b3f69231e3868ad7b77f55a49b2d8c13af66d27f37241f6438a81: Status 404 returned error can't find the container with id 77a492ab263b3f69231e3868ad7b77f55a49b2d8c13af66d27f37241f6438a81 Apr 22 17:53:58.957103 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.957080 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" Apr 22 17:53:58.963069 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:58.963044 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ebf5cfe_98a9_4a57_9613_e936af7bade0.slice/crio-755b0e418ad1fa407742dbe1157ab2e17217def0a14836b9a6313c6886a5ad2e WatchSource:0}: Error finding container 755b0e418ad1fa407742dbe1157ab2e17217def0a14836b9a6313c6886a5ad2e: Status 404 returned error can't find the container with id 755b0e418ad1fa407742dbe1157ab2e17217def0a14836b9a6313c6886a5ad2e Apr 22 17:53:58.983852 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.983822 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" Apr 22 17:53:58.991926 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:58.991900 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7bdbb8_6d5c_428b_9aa0_57c42271876a.slice/crio-07199d38eea40f4a2322210c5c8756a9315f36bda363064200e58da7be9da1e5 WatchSource:0}: Error finding container 07199d38eea40f4a2322210c5c8756a9315f36bda363064200e58da7be9da1e5: Status 404 returned error can't find the container with id 07199d38eea40f4a2322210c5c8756a9315f36bda363064200e58da7be9da1e5 Apr 22 17:53:58.999837 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:58.999818 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w5cfx" Apr 22 17:53:59.005694 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:59.005671 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77641058_c12b_451a_a008_3394e1d05e7f.slice/crio-42200057dd9b48fb71c15ca6e0d5d01cbdf71d54da88919d55445b1f86263a5c WatchSource:0}: Error finding container 42200057dd9b48fb71c15ca6e0d5d01cbdf71d54da88919d55445b1f86263a5c: Status 404 returned error can't find the container with id 42200057dd9b48fb71c15ca6e0d5d01cbdf71d54da88919d55445b1f86263a5c Apr 22 17:53:59.015744 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.015727 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f496z" Apr 22 17:53:59.021357 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:59.021331 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8999947_ef96_4a70_8257_7e319d5967db.slice/crio-e8d951db94b6c8c29f73bbcd9f27f81dbbf629e9bf3a6b2d82956643dd7071b1 WatchSource:0}: Error finding container e8d951db94b6c8c29f73bbcd9f27f81dbbf629e9bf3a6b2d82956643dd7071b1: Status 404 returned error can't find the container with id e8d951db94b6c8c29f73bbcd9f27f81dbbf629e9bf3a6b2d82956643dd7071b1 Apr 22 17:53:59.032032 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.032011 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dgpsk" Apr 22 17:53:59.037689 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:59.037669 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60af7321_9798_4032_9193_93cd4606e87a.slice/crio-5679fc74a748424248c36c8fadd742f8aeb80ba614ff13a8e9d601f691be495c WatchSource:0}: Error finding container 5679fc74a748424248c36c8fadd742f8aeb80ba614ff13a8e9d601f691be495c: Status 404 returned error can't find the container with id 5679fc74a748424248c36c8fadd742f8aeb80ba614ff13a8e9d601f691be495c Apr 22 17:53:59.046300 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.046285 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rzm27" Apr 22 17:53:59.052121 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:59.052100 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a82d37_d50f_4213_8068_442b3abeb6b3.slice/crio-26fc74c89ad004e16f403f84f4adc992461b67c2e35b52560f2e48ec5760c84e WatchSource:0}: Error finding container 26fc74c89ad004e16f403f84f4adc992461b67c2e35b52560f2e48ec5760c84e: Status 404 returned error can't find the container with id 26fc74c89ad004e16f403f84f4adc992461b67c2e35b52560f2e48ec5760c84e Apr 22 17:53:59.068540 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.068516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:53:59.074744 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:53:59.074722 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90656894_8740_4f11_8fc1_932678cddd3c.slice/crio-ff3abbdfbd07fb078a6f6087db51f38c4d1a5dbd944028bfbf5254f3d777dfc4 WatchSource:0}: Error finding container ff3abbdfbd07fb078a6f6087db51f38c4d1a5dbd944028bfbf5254f3d777dfc4: Status 404 returned error can't find the container with id ff3abbdfbd07fb078a6f6087db51f38c4d1a5dbd944028bfbf5254f3d777dfc4 Apr 22 17:53:59.354291 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.354075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:53:59.354291 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:59.354233 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:59.354291 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:59.354299 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:00.354280152 +0000 UTC m=+3.142869895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:53:59.444632 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.444595 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:59.455477 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.454931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:53:59.455477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:59.455080 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:59.455477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:59.455097 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:59.455477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:59.455108 2578 projected.go:194] 
Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:59.455477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:53:59.455173 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:00.455153014 +0000 UTC m=+3.243742814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:59.712019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.711918 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:58 +0000 UTC" deadline="2028-01-24 22:44:20.847216548 +0000 UTC" Apr 22 17:53:59.712019 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.711960 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15412h50m21.135260496s" Apr 22 17:53:59.789122 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.789059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rcvv7" event={"ID":"cec375e4-f4c2-456e-ae9c-d41aec34f6d1","Type":"ContainerStarted","Data":"77a492ab263b3f69231e3868ad7b77f55a49b2d8c13af66d27f37241f6438a81"} Apr 22 17:53:59.806049 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:53:59.805975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" event={"ID":"900252a3f62a7cd1f973133b1b7ebceb","Type":"ContainerStarted","Data":"a07d1a86cf22b700f2ec65e2ea8f76fed57ee0b3bd76ab74be6abb9b17ebd70a"} Apr 22 17:53:59.818612 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.818577 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rzm27" event={"ID":"f2a82d37-d50f-4213-8068-442b3abeb6b3","Type":"ContainerStarted","Data":"26fc74c89ad004e16f403f84f4adc992461b67c2e35b52560f2e48ec5760c84e"} Apr 22 17:53:59.827902 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.827877 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:59.852575 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.852531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerStarted","Data":"e8d951db94b6c8c29f73bbcd9f27f81dbbf629e9bf3a6b2d82956643dd7071b1"} Apr 22 17:53:59.865792 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.865742 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w5cfx" event={"ID":"77641058-c12b-451a-a008-3394e1d05e7f","Type":"ContainerStarted","Data":"42200057dd9b48fb71c15ca6e0d5d01cbdf71d54da88919d55445b1f86263a5c"} Apr 22 17:53:59.885457 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.885428 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:59.888331 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.888277 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6d4jf" 
event={"ID":"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb","Type":"ContainerStarted","Data":"00d128e755564eaa30008f607f291b60b8e8e291ea167c8335e7fed6f1a69f9b"} Apr 22 17:53:59.908777 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.901882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" event={"ID":"e8a719b01c36f6eedbdfdd5602a1c99a","Type":"ContainerStarted","Data":"803c1742360153eaf1078e9c3ef792957ecc814979a686fdf02bfd71b4951ebc"} Apr 22 17:53:59.929183 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.929129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"ff3abbdfbd07fb078a6f6087db51f38c4d1a5dbd944028bfbf5254f3d777dfc4"} Apr 22 17:53:59.950061 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.949986 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgpsk" event={"ID":"60af7321-9798-4032-9193-93cd4606e87a","Type":"ContainerStarted","Data":"5679fc74a748424248c36c8fadd742f8aeb80ba614ff13a8e9d601f691be495c"} Apr 22 17:53:59.977167 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.977071 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" event={"ID":"ef7bdbb8-6d5c-428b-9aa0-57c42271876a","Type":"ContainerStarted","Data":"07199d38eea40f4a2322210c5c8756a9315f36bda363064200e58da7be9da1e5"} Apr 22 17:53:59.998721 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:53:59.998459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" event={"ID":"0ebf5cfe-98a9-4a57-9613-e936af7bade0","Type":"ContainerStarted","Data":"755b0e418ad1fa407742dbe1157ab2e17217def0a14836b9a6313c6886a5ad2e"} Apr 22 17:54:00.363641 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:00.363606 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:00.363868 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.363804 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:00.363868 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.363865 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.363848115 +0000 UTC m=+5.152437871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:00.464614 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:00.464566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:00.464841 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.464743 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:54:00.464841 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.464781 2578 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:54:00.464841 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.464794 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:00.465020 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.464852 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:02.464833497 +0000 UTC m=+5.253423254 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:00.712743 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:00.712629 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:58 +0000 UTC" deadline="2027-10-02 23:42:35.724967248 +0000 UTC" Apr 22 17:54:00.712743 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:00.712671 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12677h48m35.012301036s" Apr 22 17:54:00.762356 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:00.762317 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:00.762546 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.762446 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:54:00.762934 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:00.762908 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:00.763054 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:00.763033 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:02.381044 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:02.381008 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:02.381493 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.381204 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:02.381493 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.381268 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:06.381249948 +0000 UTC m=+9.169839704 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:02.482911 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:02.482320 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:02.482911 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.482478 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:54:02.482911 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.482502 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:54:02.482911 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.482516 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:02.482911 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.482571 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:06.482556995 +0000 UTC m=+9.271146759 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:02.762449 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:02.762412 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:02.762623 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.762538 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:02.763000 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:02.762979 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:02.763108 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:02.763088 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:04.762602 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:04.762567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:04.763084 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:04.762695 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:04.763084 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:04.762740 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:04.763084 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:04.762861 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:06.414645 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:06.414605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:06.415071 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.414793 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:06.415071 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.414863 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:14.414840602 +0000 UTC m=+17.203430349 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:06.515694 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:06.515649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:06.515899 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.515864 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:54:06.515899 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.515881 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:54:06.515899 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.515894 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:06.516072 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.515953 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:14.515932704 +0000 UTC m=+17.304522452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:06.761943 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:06.761907 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:06.762119 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:06.761911 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:06.762185 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.762026 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:06.762241 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:06.762184 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:08.762561 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:08.762527 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:08.762963 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:08.762585 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:08.762963 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:08.762672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:08.762963 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:08.762777 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:10.762214 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:10.762176 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:10.762214 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:10.762190 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:10.762683 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:10.762302 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:10.762683 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:10.762442 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:12.762255 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:12.762222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:12.762611 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:12.762221 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:12.762611 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:12.762331 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:12.762611 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:12.762436 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:14.471118 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:14.471080 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:14.471625 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.471262 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:14.471625 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.471352 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:30.47132982 +0000 UTC m=+33.259919564 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:54:14.572321 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:14.572288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:14.572477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.572421 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:54:14.572477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.572435 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:54:14.572477 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.572444 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:14.572571 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.572492 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:30.572478215 +0000 UTC m=+33.361067957 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:54:14.762244 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:14.762040 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:14.762418 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:14.762048 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:14.762418 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.762321 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:14.762515 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:14.762416 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:16.762660 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:16.762631 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:16.763068 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:16.762631 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:16.763068 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:16.762779 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:16.763068 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:16.762838 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:18.048071 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.047882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" event={"ID":"e8a719b01c36f6eedbdfdd5602a1c99a","Type":"ContainerStarted","Data":"a52b5763af075854a2abcfca7980a8b5281e99ab64f4960b09c650ddbb08171f"}
Apr 22 17:54:18.059469 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.059431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"695f1df1be148c045a36e7fa287aaa6f9907f146bc29dfb1dec8ebe26530d210"}
Apr 22 17:54:18.059574 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.059483 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"f6c4cd0d20a7eeb4856b892153bb7eb10be82ae4d3049c8f1950e5831f60badd"}
Apr 22 17:54:18.059574 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.059499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"2736b2636fdc660163f8b3bb3ea156cc40b287ed42e5aa7e08ab34e2121f05f3"}
Apr 22 17:54:18.059574 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.059513 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"25ef48f4a207ff70a8289216bdb9aeadd4df690bb610616998ed417b6b6eb332"}
Apr 22 17:54:18.062631 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.062165 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dgpsk" event={"ID":"60af7321-9798-4032-9193-93cd4606e87a","Type":"ContainerStarted","Data":"6889ce890b8970de21751943dab5f0b112618c66d2c734df7c7d2a53c1d112ee"}
Apr 22 17:54:18.069779 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.069740 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" event={"ID":"ef7bdbb8-6d5c-428b-9aa0-57c42271876a","Type":"ContainerStarted","Data":"354a6be4681eb78055d2c5608bd46376bd20cbb5b75fad4b18b16b2f2d40be23"}
Apr 22 17:54:18.085283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.085197 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-112.ec2.internal" podStartSLOduration=21.085182405 podStartE2EDuration="21.085182405s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:54:18.065869303 +0000 UTC m=+20.854459070" watchObservedRunningTime="2026-04-22 17:54:18.085182405 +0000 UTC m=+20.873772197"
Apr 22 17:54:18.085600 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.085570 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dgpsk" podStartSLOduration=2.767039142 podStartE2EDuration="21.085562503s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:59.039163234 +0000 UTC m=+1.827752977" lastFinishedPulling="2026-04-22 17:54:17.357686596 +0000 UTC m=+20.146276338" observedRunningTime="2026-04-22 17:54:18.085123286 +0000 UTC m=+20.873713051" watchObservedRunningTime="2026-04-22 17:54:18.085562503 +0000 UTC m=+20.874152268"
Apr 22 17:54:18.103272 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.103213 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xvsrm" podStartSLOduration=2.7476640469999998 podStartE2EDuration="21.103199851s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:58.993436613 +0000 UTC m=+1.782026356" lastFinishedPulling="2026-04-22 17:54:17.348972408 +0000 UTC m=+20.137562160" observedRunningTime="2026-04-22 17:54:18.102858092 +0000 UTC m=+20.891447858" watchObservedRunningTime="2026-04-22 17:54:18.103199851 +0000 UTC m=+20.891789610"
Apr 22 17:54:18.762167 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.762137 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:18.762317 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:18.762249 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:18.762317 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:18.762303 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:18.762407 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:18.762390 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:19.073212 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.073178 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8999947-ef96-4a70-8257-7e319d5967db" containerID="93192554a96eeacf8295841c710704bc71654ae239896e86286c018d0c74cd65" exitCode=0
Apr 22 17:54:19.073817 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.073254 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerDied","Data":"93192554a96eeacf8295841c710704bc71654ae239896e86286c018d0c74cd65"}
Apr 22 17:54:19.074993 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.074750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w5cfx" event={"ID":"77641058-c12b-451a-a008-3394e1d05e7f","Type":"ContainerStarted","Data":"fb771f7db6347057a5abfa7bfc9d9d9e82ed139cda8e4bd28aa2d7fbf6ed75f4"}
Apr 22 17:54:19.076658 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.076630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6d4jf" event={"ID":"5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb","Type":"ContainerStarted","Data":"3b1c51292a9c0524ea5f06598afdd416acc82be1366c0ad9c117d4eed5c39ffd"}
Apr 22 17:54:19.079722 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.079667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"963ea1d252c9226fed4baeea0023286716aa7a0eb2469731c7ff9b622ad28452"}
Apr 22 17:54:19.079722 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.079701 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"ebca79a7381feac6f4a518555549956230e4dfff943053597c6f722f1cd50822"}
Apr 22 17:54:19.084494 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.084466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" event={"ID":"0ebf5cfe-98a9-4a57-9613-e936af7bade0","Type":"ContainerStarted","Data":"8c7db27a1fa5efa134efe01b8fd1acefed45d79fd88dff38058a924442de9adb"}
Apr 22 17:54:19.085784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.085737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rcvv7" event={"ID":"cec375e4-f4c2-456e-ae9c-d41aec34f6d1","Type":"ContainerStarted","Data":"70ecce70ac4c3f66e4a4c2fc84b7ea28b76330a97762469b048f7206416ecc4b"}
Apr 22 17:54:19.087262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.087239 2578 generic.go:358] "Generic (PLEG): container finished" podID="900252a3f62a7cd1f973133b1b7ebceb" containerID="7c5421a3542462dfa1f75891abc111aa14369cc259ad0243c7d7b701a8eb0ee1" exitCode=0
Apr 22 17:54:19.087486 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.087461 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" event={"ID":"900252a3f62a7cd1f973133b1b7ebceb","Type":"ContainerDied","Data":"7c5421a3542462dfa1f75891abc111aa14369cc259ad0243c7d7b701a8eb0ee1"}
Apr 22 17:54:19.116056 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.116009 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rcvv7" podStartSLOduration=3.749346592 podStartE2EDuration="22.115995005s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:58.957628415 +0000 UTC m=+1.746218161" lastFinishedPulling="2026-04-22 17:54:17.324276829 +0000 UTC m=+20.112866574" observedRunningTime="2026-04-22 17:54:19.115716784 +0000 UTC m=+21.904306549" watchObservedRunningTime="2026-04-22 17:54:19.115995005 +0000 UTC m=+21.904584769"
Apr 22 17:54:19.130282 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.130231 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w5cfx" podStartSLOduration=3.812631621 podStartE2EDuration="22.130213012s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:59.007206023 +0000 UTC m=+1.795795765" lastFinishedPulling="2026-04-22 17:54:17.324787399 +0000 UTC m=+20.113377156" observedRunningTime="2026-04-22 17:54:19.129909916 +0000 UTC m=+21.918499679" watchObservedRunningTime="2026-04-22 17:54:19.130213012 +0000 UTC m=+21.918802811"
Apr 22 17:54:19.145801 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.145725 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6d4jf" podStartSLOduration=3.592773599 podStartE2EDuration="22.145712251s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:58.771303549 +0000 UTC m=+1.559893295" lastFinishedPulling="2026-04-22 17:54:17.324242205 +0000 UTC m=+20.112831947" observedRunningTime="2026-04-22 17:54:19.145519689 +0000 UTC m=+21.934109453" watchObservedRunningTime="2026-04-22 17:54:19.145712251 +0000 UTC m=+21.934302014"
Apr 22 17:54:19.436746 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.436719 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 17:54:19.707259 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.707111 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:54:19.436741965Z","UUID":"83d4cc3f-12a1-4828-ae18-e6108baf7e7a","Handler":null,"Name":"","Endpoint":""}
Apr 22 17:54:19.710834 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.710807 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 17:54:19.710834 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:19.710839 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 17:54:20.091691 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.091652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rzm27" event={"ID":"f2a82d37-d50f-4213-8068-442b3abeb6b3","Type":"ContainerStarted","Data":"b7d94659aaa01925198f479b1fa6fda29f81327c3054b978d75cf58f88f52f57"}
Apr 22 17:54:20.093916 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.093881 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" event={"ID":"0ebf5cfe-98a9-4a57-9613-e936af7bade0","Type":"ContainerStarted","Data":"3924fe05003bca02eae03edbf1a39f2f71e6652216d5c37cc206b323c8cb4d09"}
Apr 22 17:54:20.095975 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.095948 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" event={"ID":"900252a3f62a7cd1f973133b1b7ebceb","Type":"ContainerStarted","Data":"d6ce8972cdbaf9171c3c030a1c34e25a199bd48ab7c82afff37eebfcda5bee0c"}
Apr 22 17:54:20.126191 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.126148 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rzm27" podStartSLOduration=4.82755704 podStartE2EDuration="23.126132748s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:59.053590977 +0000 UTC m=+1.842180719" lastFinishedPulling="2026-04-22 17:54:17.352166682 +0000 UTC m=+20.140756427" observedRunningTime="2026-04-22 17:54:20.109780676 +0000 UTC m=+22.898370440" watchObservedRunningTime="2026-04-22 17:54:20.126132748 +0000 UTC m=+22.914722511"
Apr 22 17:54:20.126379 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.126275 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-112.ec2.internal" podStartSLOduration=23.126270722 podStartE2EDuration="23.126270722s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:54:20.125922499 +0000 UTC m=+22.914512454" watchObservedRunningTime="2026-04-22 17:54:20.126270722 +0000 UTC m=+22.914860486"
Apr 22 17:54:20.157823 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.157791 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rcvv7"
Apr 22 17:54:20.762725 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.762696 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:20.762930 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:20.762707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:20.762930 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:20.762856 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5"
Apr 22 17:54:20.763026 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:20.762960 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:21.101241 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:21.101155 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"5d7840bf900529a9a3bc3ef75535997e61f81ee010a8a7ac9018fc3c972f0362"}
Apr 22 17:54:21.103268 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:21.103239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" event={"ID":"0ebf5cfe-98a9-4a57-9613-e936af7bade0","Type":"ContainerStarted","Data":"ea5d391f808ce39c60f00ee5cafb6747c285b3a585867c2257c670edabee91ee"}
Apr 22 17:54:21.127547 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:21.127494 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c2p8t" podStartSLOduration=2.904392738 podStartE2EDuration="24.127474039s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:58.964550503 +0000 UTC m=+1.753140245" lastFinishedPulling="2026-04-22 17:54:20.187631804 +0000 UTC m=+22.976221546" observedRunningTime="2026-04-22 17:54:21.127159928 +0000 UTC m=+23.915749691" watchObservedRunningTime="2026-04-22 17:54:21.127474039 +0000 UTC m=+23.916063804"
Apr 22 17:54:22.762044 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:22.762021 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:54:22.762488 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:22.762021 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:54:22.762488 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:22.762155 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1"
Apr 22 17:54:22.762488 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:22.762240 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:54:23.111840 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:23.111331 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" event={"ID":"90656894-8740-4f11-8fc1-932678cddd3c","Type":"ContainerStarted","Data":"f7865afc83e391310cbcab83eeb4b439f21241db71317d3de1aeabeda92e7f8b"} Apr 22 17:54:23.111840 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:23.111654 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:54:23.128658 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:23.128535 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:54:23.141220 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:23.141168 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" podStartSLOduration=7.45281085 podStartE2EDuration="26.141154983s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:59.076611222 +0000 UTC m=+1.865200978" lastFinishedPulling="2026-04-22 17:54:17.764955362 +0000 UTC m=+20.553545111" observedRunningTime="2026-04-22 17:54:23.140635644 +0000 UTC m=+25.929225407" watchObservedRunningTime="2026-04-22 17:54:23.141154983 +0000 UTC m=+25.929744746" Apr 22 17:54:23.198952 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:23.198910 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:54:23.199557 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:23.199534 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:54:24.114791 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.114738 2578 generic.go:358] 
"Generic (PLEG): container finished" podID="d8999947-ef96-4a70-8257-7e319d5967db" containerID="c0e9ca109f3509d8fa70322672f61f5eb16568e2129b6c45ab28b7da4e74b00d" exitCode=0 Apr 22 17:54:24.115577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.114854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerDied","Data":"c0e9ca109f3509d8fa70322672f61f5eb16568e2129b6c45ab28b7da4e74b00d"} Apr 22 17:54:24.115784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.115688 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:54:24.115784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.115717 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:54:24.115784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.115774 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rcvv7" Apr 22 17:54:24.130652 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.130606 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:54:24.762297 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.762231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:24.762621 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:24.762383 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:54:24.762621 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.762453 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:24.762621 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:24.762554 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:54:24.988545 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.988349 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jgqt"] Apr 22 17:54:24.989129 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:24.989103 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bnklt"] Apr 22 17:54:25.116634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:25.116546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:25.116634 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:25.116578 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:25.117107 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:25.116636 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:54:25.117107 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:25.116836 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:54:26.121803 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:26.121746 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8999947-ef96-4a70-8257-7e319d5967db" containerID="8353e1be8dafdf17dc87a3cec176d21988c585a151d31a60d212c351f1e6ff38" exitCode=0 Apr 22 17:54:26.122250 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:26.121848 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerDied","Data":"8353e1be8dafdf17dc87a3cec176d21988c585a151d31a60d212c351f1e6ff38"} Apr 22 17:54:26.761824 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:26.761788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:26.761999 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:26.761831 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:26.761999 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:26.761920 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:54:26.762119 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:26.762040 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:54:27.128061 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:27.128028 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8999947-ef96-4a70-8257-7e319d5967db" containerID="e9fbe2141cd06ace4a3dbd662be9feb653d2301d3ee1b15f871ff7dd5573cc9c" exitCode=0 Apr 22 17:54:27.128428 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:27.128077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerDied","Data":"e9fbe2141cd06ace4a3dbd662be9feb653d2301d3ee1b15f871ff7dd5573cc9c"} Apr 22 17:54:28.762766 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:28.762719 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:28.763303 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:28.762854 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:54:28.763303 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:28.762891 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:28.763303 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:28.763020 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:54:30.490519 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:30.490481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:30.490953 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.490612 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:30.490953 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.490675 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:02.49065417 +0000 UTC m=+65.279243939 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:30.590947 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:30.590914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:30.591109 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.591032 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:54:30.591109 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.591045 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:54:30.591109 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.591053 2578 projected.go:194] Error preparing data for projected volume kube-api-access-jwkkx for pod openshift-network-diagnostics/network-check-target-bnklt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:30.591109 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.591100 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx podName:76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:55:02.59108595 +0000 UTC m=+65.379675692 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jwkkx" (UniqueName: "kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx") pod "network-check-target-bnklt" (UID: "76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:30.762428 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:30.762398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:30.762592 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:30.762398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:30.762592 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.762514 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnklt" podUID="76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1" Apr 22 17:54:30.762695 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:30.762588 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:54:31.587452 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.587424 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-112.ec2.internal" event="NodeReady" Apr 22 17:54:31.587934 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.587572 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:54:31.638088 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.638051 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w9kfc"] Apr 22 17:54:31.661846 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.661806 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9h2ht"] Apr 22 17:54:31.662017 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.661973 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:31.665161 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.664916 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:54:31.665161 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.664993 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:54:31.665161 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.664992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w7jfh\"" Apr 22 17:54:31.665408 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.665201 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:54:31.674546 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.674526 2578 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w9kfc"] Apr 22 17:54:31.674637 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.674553 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9h2ht"] Apr 22 17:54:31.674685 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.674651 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.678101 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.677434 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:54:31.678101 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.677445 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-q8m4n\"" Apr 22 17:54:31.678101 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.677712 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:54:31.800029 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.799994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmgn\" (UniqueName: \"kubernetes.io/projected/8201d0cd-3948-40c0-a905-1339af44c0e0-kube-api-access-dgmgn\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.800215 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.800059 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.800215 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.800085 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:31.800215 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.800121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8201d0cd-3948-40c0-a905-1339af44c0e0-tmp-dir\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.800215 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.800158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8201d0cd-3948-40c0-a905-1339af44c0e0-config-volume\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.800215 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.800183 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdcsq\" (UniqueName: \"kubernetes.io/projected/7faa2f31-ff6f-410e-94f4-9f9b7810616b-kube-api-access-hdcsq\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:31.900844 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.900767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8201d0cd-3948-40c0-a905-1339af44c0e0-tmp-dir\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.900844 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:54:31.900827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8201d0cd-3948-40c0-a905-1339af44c0e0-config-volume\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.901046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.900853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdcsq\" (UniqueName: \"kubernetes.io/projected/7faa2f31-ff6f-410e-94f4-9f9b7810616b-kube-api-access-hdcsq\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:31.901046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.900890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmgn\" (UniqueName: \"kubernetes.io/projected/8201d0cd-3948-40c0-a905-1339af44c0e0-kube-api-access-dgmgn\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.901046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.900942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.901046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.900969 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:31.901228 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:31.901075 2578 
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:31.901228 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:31.901091 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:31.901228 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:31.901144 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:32.401124092 +0000 UTC m=+35.189713838 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found Apr 22 17:54:31.901228 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:31.901164 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:32.401154418 +0000 UTC m=+35.189744165 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:54:31.901228 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.901180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8201d0cd-3948-40c0-a905-1339af44c0e0-tmp-dir\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.901423 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.901339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8201d0cd-3948-40c0-a905-1339af44c0e0-config-volume\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.912515 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.912485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmgn\" (UniqueName: \"kubernetes.io/projected/8201d0cd-3948-40c0-a905-1339af44c0e0-kube-api-access-dgmgn\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:31.913893 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:31.912539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdcsq\" (UniqueName: \"kubernetes.io/projected/7faa2f31-ff6f-410e-94f4-9f9b7810616b-kube-api-access-hdcsq\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:32.404979 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.404936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:32.404979 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.404985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:32.405223 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:32.405094 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:32.405223 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:32.405094 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:32.405223 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:32.405161 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:33.40514531 +0000 UTC m=+36.193735071 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found Apr 22 17:54:32.405223 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:32.405176 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:33.40516996 +0000 UTC m=+36.193759702 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:54:32.762209 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.762170 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:54:32.762746 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.762392 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:54:32.765317 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.765291 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:54:32.766689 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.766668 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bvxbv\"" Apr 22 17:54:32.766824 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.766741 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcf4z\"" Apr 22 17:54:32.766824 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.766806 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:54:32.766912 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:32.766903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:54:33.415046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:33.414996 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:33.415046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:33.415049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:33.415303 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:33.415155 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:33.415303 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:33.415189 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:33.415303 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:33.415237 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:35.415218004 +0000 UTC m=+38.203807746 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:54:33.415303 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:33.415260 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:35.415249527 +0000 UTC m=+38.203839270 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found Apr 22 17:54:35.146890 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:35.146858 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8999947-ef96-4a70-8257-7e319d5967db" containerID="d6fa0d0587809af267e268a190f7882e61cc82ea503d228bbdae1ad5dd71f811" exitCode=0 Apr 22 17:54:35.147411 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:35.146906 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerDied","Data":"d6fa0d0587809af267e268a190f7882e61cc82ea503d228bbdae1ad5dd71f811"} Apr 22 17:54:35.430683 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:35.430591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:35.430683 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:35.430637 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:35.430926 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:35.430742 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:35.430926 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:35.430744 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:35.430926 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:35.430830 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:54:39.430810643 +0000 UTC m=+42.219400385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found Apr 22 17:54:35.430926 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:35.430847 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:39.430839665 +0000 UTC m=+42.219429410 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:54:36.151536 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:36.151503 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8999947-ef96-4a70-8257-7e319d5967db" containerID="d9c26bc2b9500722e5cf1e74478675fefb03b75455d08d1c0cdba6196c2cd7f1" exitCode=0 Apr 22 17:54:36.152026 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:36.151559 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerDied","Data":"d9c26bc2b9500722e5cf1e74478675fefb03b75455d08d1c0cdba6196c2cd7f1"} Apr 22 17:54:37.156360 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:37.156323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f496z" event={"ID":"d8999947-ef96-4a70-8257-7e319d5967db","Type":"ContainerStarted","Data":"390468fa3e1af47e259236660b19de1d6241ccc409a5dd55dc652350038cd80a"} Apr 22 17:54:37.183764 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:37.183716 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f496z" podStartSLOduration=4.898507439 podStartE2EDuration="40.183702614s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:53:59.022809061 +0000 UTC m=+1.811398808" lastFinishedPulling="2026-04-22 17:54:34.308004238 +0000 UTC m=+37.096593983" observedRunningTime="2026-04-22 17:54:37.182490264 +0000 UTC m=+39.971080027" watchObservedRunningTime="2026-04-22 17:54:37.183702614 +0000 UTC m=+39.972292379" Apr 22 17:54:39.458262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:39.458212 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:39.458262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:39.458258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:39.458672 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:39.458363 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:39.458672 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:39.458369 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:39.458672 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:39.458425 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:47.458409261 +0000 UTC m=+50.246999003 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:54:39.458672 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:39.458440 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:47.458433001 +0000 UTC m=+50.247022745 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found Apr 22 17:54:46.720125 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.720087 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s"] Apr 22 17:54:46.753226 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.753195 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s"] Apr 22 17:54:46.753387 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.753312 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:46.755966 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.755942 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 17:54:46.756102 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.756048 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 17:54:46.757382 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.757359 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 17:54:46.757382 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.757372 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 17:54:46.910793 
ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.910731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/469dbdac-5689-4f0b-aa89-27b7a7f0f395-tmp\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:46.910973 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.910852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mbb\" (UniqueName: \"kubernetes.io/projected/469dbdac-5689-4f0b-aa89-27b7a7f0f395-kube-api-access-c6mbb\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:46.910973 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:46.910922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/469dbdac-5689-4f0b-aa89-27b7a7f0f395-klusterlet-config\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.011333 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.011303 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mbb\" (UniqueName: \"kubernetes.io/projected/469dbdac-5689-4f0b-aa89-27b7a7f0f395-kube-api-access-c6mbb\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.011429 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.011371 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/469dbdac-5689-4f0b-aa89-27b7a7f0f395-klusterlet-config\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.011429 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.011415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/469dbdac-5689-4f0b-aa89-27b7a7f0f395-tmp\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.011850 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.011830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/469dbdac-5689-4f0b-aa89-27b7a7f0f395-tmp\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.014242 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.014211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/469dbdac-5689-4f0b-aa89-27b7a7f0f395-klusterlet-config\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" (UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.021832 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.021810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mbb\" (UniqueName: \"kubernetes.io/projected/469dbdac-5689-4f0b-aa89-27b7a7f0f395-kube-api-access-c6mbb\") pod \"klusterlet-addon-workmgr-d5c445fb4-rdw8s\" 
(UID: \"469dbdac-5689-4f0b-aa89-27b7a7f0f395\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.062082 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.062046 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:47.237666 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.237637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s"] Apr 22 17:54:47.241783 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:54:47.241731 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod469dbdac_5689_4f0b_aa89_27b7a7f0f395.slice/crio-c72a3633eb5f3fb5abc759176e15da6747e67108f38c6dafbe56ffcdfd67eb53 WatchSource:0}: Error finding container c72a3633eb5f3fb5abc759176e15da6747e67108f38c6dafbe56ffcdfd67eb53: Status 404 returned error can't find the container with id c72a3633eb5f3fb5abc759176e15da6747e67108f38c6dafbe56ffcdfd67eb53 Apr 22 17:54:47.515727 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.515685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:54:47.515727 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:47.515729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:54:47.515963 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:47.515843 
2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:47.515963 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:47.515866 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:47.515963 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:47.515908 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:55:03.51589462 +0000 UTC m=+66.304484363 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found Apr 22 17:54:47.515963 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:54:47.515934 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:03.515919659 +0000 UTC m=+66.304509401 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:54:48.179962 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:48.179922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" event={"ID":"469dbdac-5689-4f0b-aa89-27b7a7f0f395","Type":"ContainerStarted","Data":"c72a3633eb5f3fb5abc759176e15da6747e67108f38c6dafbe56ffcdfd67eb53"} Apr 22 17:54:51.186696 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:51.186657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" event={"ID":"469dbdac-5689-4f0b-aa89-27b7a7f0f395","Type":"ContainerStarted","Data":"e5a59db22e011e7aba878e9f7df74ac0bff6eac6a8c46c867561653b02a64d27"} Apr 22 17:54:51.187103 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:51.186873 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:51.188463 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:51.188443 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:54:51.205046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:51.205005 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" podStartSLOduration=1.423438402 podStartE2EDuration="5.204993745s" podCreationTimestamp="2026-04-22 17:54:46 +0000 UTC" firstStartedPulling="2026-04-22 17:54:47.24309836 +0000 UTC m=+50.031688102" lastFinishedPulling="2026-04-22 17:54:51.024653686 +0000 UTC m=+53.813243445" 
observedRunningTime="2026-04-22 17:54:51.204332726 +0000 UTC m=+53.992922516" watchObservedRunningTime="2026-04-22 17:54:51.204993745 +0000 UTC m=+53.993583508" Apr 22 17:54:56.133124 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:54:56.133087 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jx4b" Apr 22 17:55:02.523844 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.523801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:55:02.526913 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.526895 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:55:02.534503 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:02.534480 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:55:02.534573 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:02.534540 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:06.534524759 +0000 UTC m=+129.323114502 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : secret "metrics-daemon-secret" not found Apr 22 17:55:02.624679 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.624651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:55:02.627994 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.627972 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:55:02.638745 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.638720 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:55:02.648676 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.648650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwkkx\" (UniqueName: \"kubernetes.io/projected/76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1-kube-api-access-jwkkx\") pod \"network-check-target-bnklt\" (UID: \"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1\") " pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:55:02.776673 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.776588 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcf4z\"" Apr 22 17:55:02.783897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.783866 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnklt" Apr 22 17:55:02.897315 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:02.897275 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bnklt"] Apr 22 17:55:02.901280 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:55:02.901255 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76eee7a4_cad7_46a8_bca3_2eb3e2d99cd1.slice/crio-252c70feb5d75ba7d0f77f22ccb0efbfc78eb939fc78687888f7c8afeca78478 WatchSource:0}: Error finding container 252c70feb5d75ba7d0f77f22ccb0efbfc78eb939fc78687888f7c8afeca78478: Status 404 returned error can't find the container with id 252c70feb5d75ba7d0f77f22ccb0efbfc78eb939fc78687888f7c8afeca78478 Apr 22 17:55:03.210120 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:03.210013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bnklt" event={"ID":"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1","Type":"ContainerStarted","Data":"252c70feb5d75ba7d0f77f22ccb0efbfc78eb939fc78687888f7c8afeca78478"} Apr 22 17:55:03.530784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:03.530724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:55:03.530784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:03.530791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:55:03.531217 
ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:03.530872 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:55:03.531217 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:03.530908 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:55:03.531217 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:03.530936 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.530920475 +0000 UTC m=+98.319510217 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found Apr 22 17:55:03.531217 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:03.530950 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.530944813 +0000 UTC m=+98.319534556 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found
Apr 22 17:55:06.217462 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:06.217376 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bnklt" event={"ID":"76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1","Type":"ContainerStarted","Data":"c24d7f73e3368d331f55fd37650bd5d4ba3701de4066ea4e1fdcb23337436c55"}
Apr 22 17:55:06.217924 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:06.217498 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:55:06.233842 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:06.233799 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bnklt" podStartSLOduration=66.324026574 podStartE2EDuration="1m9.233785368s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:55:02.903229401 +0000 UTC m=+65.691819146" lastFinishedPulling="2026-04-22 17:55:05.812988199 +0000 UTC m=+68.601577940" observedRunningTime="2026-04-22 17:55:06.233305624 +0000 UTC m=+69.021895387" watchObservedRunningTime="2026-04-22 17:55:06.233785368 +0000 UTC m=+69.022375132"
Apr 22 17:55:35.564447 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:35.564395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht"
Apr 22 17:55:35.564447 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:35.564450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc"
Apr 22 17:55:35.565039 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:35.564911 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:55:35.565094 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:35.565061 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls podName:8201d0cd-3948-40c0-a905-1339af44c0e0 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:39.565038918 +0000 UTC m=+162.353628677 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls") pod "dns-default-9h2ht" (UID: "8201d0cd-3948-40c0-a905-1339af44c0e0") : secret "dns-default-metrics-tls" not found
Apr 22 17:55:35.565505 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:35.564911 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:55:35.565564 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:55:35.565555 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert podName:7faa2f31-ff6f-410e-94f4-9f9b7810616b nodeName:}" failed. No retries permitted until 2026-04-22 17:56:39.565535634 +0000 UTC m=+162.354125390 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert") pod "ingress-canary-w9kfc" (UID: "7faa2f31-ff6f-410e-94f4-9f9b7810616b") : secret "canary-serving-cert" not found
Apr 22 17:55:37.222605 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:55:37.222573 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bnklt"
Apr 22 17:56:06.580833 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:06.580776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt"
Apr 22 17:56:06.581321 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:06.580935 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:56:06.581321 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:06.581013 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs podName:8beb4b83-ece9-44df-bc80-fea79bf050d5 nodeName:}" failed. No retries permitted until 2026-04-22 17:58:08.580995699 +0000 UTC m=+251.369585441 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs") pod "network-metrics-daemon-4jgqt" (UID: "8beb4b83-ece9-44df-bc80-fea79bf050d5") : secret "metrics-daemon-secret" not found
Apr 22 17:56:10.034691 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.034656 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"]
Apr 22 17:56:10.037534 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.037515 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"
Apr 22 17:56:10.040259 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.040235 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:56:10.041329 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.041308 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 17:56:10.041425 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.041324 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fvmk9\""
Apr 22 17:56:10.046350 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.046310 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"]
Apr 22 17:56:10.105615 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.105570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64w4v\" (UniqueName: \"kubernetes.io/projected/1c982a62-ab1c-4e4a-973c-aafc17ef396c-kube-api-access-64w4v\") pod \"volume-data-source-validator-7c6cbb6c87-t2gkt\" (UID: \"1c982a62-ab1c-4e4a-973c-aafc17ef396c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"
Apr 22 17:56:10.205991 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.205937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64w4v\" (UniqueName: \"kubernetes.io/projected/1c982a62-ab1c-4e4a-973c-aafc17ef396c-kube-api-access-64w4v\") pod \"volume-data-source-validator-7c6cbb6c87-t2gkt\" (UID: \"1c982a62-ab1c-4e4a-973c-aafc17ef396c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"
Apr 22 17:56:10.214855 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.214820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64w4v\" (UniqueName: \"kubernetes.io/projected/1c982a62-ab1c-4e4a-973c-aafc17ef396c-kube-api-access-64w4v\") pod \"volume-data-source-validator-7c6cbb6c87-t2gkt\" (UID: \"1c982a62-ab1c-4e4a-973c-aafc17ef396c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"
Apr 22 17:56:10.347833 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.347736 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"
Apr 22 17:56:10.463825 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:10.463792 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt"]
Apr 22 17:56:10.467222 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:10.467191 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c982a62_ab1c_4e4a_973c_aafc17ef396c.slice/crio-912acc38135a8e2cdae3a9ce40267e060f8bf7173415bf56c4b610c14b503362 WatchSource:0}: Error finding container 912acc38135a8e2cdae3a9ce40267e060f8bf7173415bf56c4b610c14b503362: Status 404 returned error can't find the container with id 912acc38135a8e2cdae3a9ce40267e060f8bf7173415bf56c4b610c14b503362
Apr 22 17:56:11.341366 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:11.341323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt" event={"ID":"1c982a62-ab1c-4e4a-973c-aafc17ef396c","Type":"ContainerStarted","Data":"912acc38135a8e2cdae3a9ce40267e060f8bf7173415bf56c4b610c14b503362"}
Apr 22 17:56:12.344529 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:12.344491 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt" event={"ID":"1c982a62-ab1c-4e4a-973c-aafc17ef396c","Type":"ContainerStarted","Data":"4031251453c20aaf4e59277b2ac546f475e4662b63c0f6e1c22d0be4d8098e8d"}
Apr 22 17:56:12.360444 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:12.360396 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-t2gkt" podStartSLOduration=0.545182613 podStartE2EDuration="2.360381028s" podCreationTimestamp="2026-04-22 17:56:10 +0000 UTC" firstStartedPulling="2026-04-22 17:56:10.469025785 +0000 UTC m=+133.257615527" lastFinishedPulling="2026-04-22 17:56:12.284224195 +0000 UTC m=+135.072813942" observedRunningTime="2026-04-22 17:56:12.359989544 +0000 UTC m=+135.148579308" watchObservedRunningTime="2026-04-22 17:56:12.360381028 +0000 UTC m=+135.148970789"
Apr 22 17:56:13.023490 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.023457 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"]
Apr 22 17:56:13.026310 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.026294 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.029046 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.029026 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 17:56:13.030123 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.030100 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cp2vn\""
Apr 22 17:56:13.030250 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.030121 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 17:56:13.030250 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.030136 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:56:13.037245 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.037224 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"]
Apr 22 17:56:13.129347 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.129317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkr6r\" (UniqueName: \"kubernetes.io/projected/57424c14-a626-4d2e-9057-40a634e4ecd2-kube-api-access-pkr6r\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.129499 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.129379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.230599 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.230561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.230789 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.230619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkr6r\" (UniqueName: \"kubernetes.io/projected/57424c14-a626-4d2e-9057-40a634e4ecd2-kube-api-access-pkr6r\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.230789 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:13.230712 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:56:13.230885 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:13.230807 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls podName:57424c14-a626-4d2e-9057-40a634e4ecd2 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:13.730789407 +0000 UTC m=+136.519379148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xv6r7" (UID: "57424c14-a626-4d2e-9057-40a634e4ecd2") : secret "samples-operator-tls" not found
Apr 22 17:56:13.239986 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.239963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkr6r\" (UniqueName: \"kubernetes.io/projected/57424c14-a626-4d2e-9057-40a634e4ecd2-kube-api-access-pkr6r\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.734322 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:13.734262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:13.734713 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:13.734390 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:56:13.734713 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:13.734445 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls podName:57424c14-a626-4d2e-9057-40a634e4ecd2 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:14.734431124 +0000 UTC m=+137.523020866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xv6r7" (UID: "57424c14-a626-4d2e-9057-40a634e4ecd2") : secret "samples-operator-tls" not found
Apr 22 17:56:14.741106 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:14.741061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"
Apr 22 17:56:14.741491 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:14.741178 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:56:14.741491 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:14.741233 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls podName:57424c14-a626-4d2e-9057-40a634e4ecd2 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:16.741218204 +0000 UTC m=+139.529807946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xv6r7" (UID: "57424c14-a626-4d2e-9057-40a634e4ecd2") : secret "samples-operator-tls" not found
Apr 22 17:56:15.026318 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.026285 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7mmh4"]
Apr 22 17:56:15.029238 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.029222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.031814 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.031788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 17:56:15.032961 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.032940 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 17:56:15.032961 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.032953 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:56:15.033134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.032953 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 17:56:15.033134 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.032941 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9c22g\""
Apr 22 17:56:15.038188 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.038163 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 17:56:15.038961 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.038944 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7mmh4"]
Apr 22 17:56:15.066533 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.066505 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6d4jf_5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb/dns-node-resolver/0.log"
Apr 22 17:56:15.144005 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.143974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938d50a9-283b-4aa2-b8ec-60b629bd4253-config\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.144160 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.144009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/938d50a9-283b-4aa2-b8ec-60b629bd4253-serving-cert\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.144160 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.144111 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/938d50a9-283b-4aa2-b8ec-60b629bd4253-trusted-ca\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.144160 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.144142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49sk2\" (UniqueName: \"kubernetes.io/projected/938d50a9-283b-4aa2-b8ec-60b629bd4253-kube-api-access-49sk2\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.244595 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.244545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49sk2\" (UniqueName: \"kubernetes.io/projected/938d50a9-283b-4aa2-b8ec-60b629bd4253-kube-api-access-49sk2\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.244595 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.244602 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938d50a9-283b-4aa2-b8ec-60b629bd4253-config\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.244891 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.244632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/938d50a9-283b-4aa2-b8ec-60b629bd4253-serving-cert\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.244891 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.244745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/938d50a9-283b-4aa2-b8ec-60b629bd4253-trusted-ca\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.245328 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.245308 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938d50a9-283b-4aa2-b8ec-60b629bd4253-config\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.245932 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.245910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/938d50a9-283b-4aa2-b8ec-60b629bd4253-trusted-ca\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.247191 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.247162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/938d50a9-283b-4aa2-b8ec-60b629bd4253-serving-cert\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.253494 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.253476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49sk2\" (UniqueName: \"kubernetes.io/projected/938d50a9-283b-4aa2-b8ec-60b629bd4253-kube-api-access-49sk2\") pod \"console-operator-9d4b6777b-7mmh4\" (UID: \"938d50a9-283b-4aa2-b8ec-60b629bd4253\") " pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.338581 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.338500 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4"
Apr 22 17:56:15.457277 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.457245 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7mmh4"]
Apr 22 17:56:15.460025 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:15.459984 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938d50a9_283b_4aa2_b8ec_60b629bd4253.slice/crio-af611ae425c392f73e15fe903f3cfb636a61fecbfecf4d20da7f7d1184a16d02 WatchSource:0}: Error finding container af611ae425c392f73e15fe903f3cfb636a61fecbfecf4d20da7f7d1184a16d02: Status 404 returned error can't find the container with id af611ae425c392f73e15fe903f3cfb636a61fecbfecf4d20da7f7d1184a16d02
Apr 22 17:56:15.864496 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.864458 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-686b9f6c49-s6jhd"]
Apr 22 17:56:15.868680 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.868662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.871487 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.871462 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 17:56:15.871487 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.871482 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 17:56:15.871658 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.871514 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 17:56:15.871867 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.871852 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wcdv8\""
Apr 22 17:56:15.877714 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.877696 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 17:56:15.879066 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.879042 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-686b9f6c49-s6jhd"]
Apr 22 17:56:15.948581 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-trusted-ca\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948743 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-image-registry-private-configuration\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948743 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948743 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-bound-sa-token\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948868 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9cx8\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-kube-api-access-k9cx8\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948868 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/518038bf-0a79-4055-8085-82633fe5df1d-ca-trust-extracted\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948868 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-installation-pull-secrets\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:15.948958 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:15.948872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-registry-certificates\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.049957 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.049921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-registry-certificates\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050117 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.049976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-trusted-ca\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050117 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-image-registry-private-configuration\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050117 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050117 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-bound-sa-token\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050316 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9cx8\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-kube-api-access-k9cx8\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050316 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/518038bf-0a79-4055-8085-82633fe5df1d-ca-trust-extracted\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050316 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-installation-pull-secrets\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050316 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.050201 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:56:16.050316 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.050223 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-686b9f6c49-s6jhd: secret "image-registry-tls" not found
Apr 22 17:56:16.050316 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.050281 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls podName:518038bf-0a79-4055-8085-82633fe5df1d nodeName:}" failed. No retries permitted until 2026-04-22 17:56:16.55026129 +0000 UTC m=+139.338851034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls") pod "image-registry-686b9f6c49-s6jhd" (UID: "518038bf-0a79-4055-8085-82633fe5df1d") : secret "image-registry-tls" not found
Apr 22 17:56:16.050885 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/518038bf-0a79-4055-8085-82633fe5df1d-ca-trust-extracted\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.050991 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.050978 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-registry-certificates\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.051342 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.051321 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-trusted-ca\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.053024 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.053002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-image-registry-private-configuration\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.053221 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.053205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-installation-pull-secrets\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.061967 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.061944 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-bound-sa-token\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.062298 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.062273 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9cx8\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-kube-api-access-k9cx8\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd"
Apr 22 17:56:16.065067 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.065046 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w5cfx_77641058-c12b-451a-a008-3394e1d05e7f/node-ca/0.log"
Apr 22 17:56:16.354345 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.354305 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" event={"ID":"938d50a9-283b-4aa2-b8ec-60b629bd4253","Type":"ContainerStarted","Data":"af611ae425c392f73e15fe903f3cfb636a61fecbfecf4d20da7f7d1184a16d02"}
Apr 22 17:56:16.555091 ip-10-0-130-112
kubenswrapper[2578]: I0422 17:56:16.555056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:16.555287 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.555239 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:56:16.555287 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.555261 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-686b9f6c49-s6jhd: secret "image-registry-tls" not found Apr 22 17:56:16.555396 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.555333 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls podName:518038bf-0a79-4055-8085-82633fe5df1d nodeName:}" failed. No retries permitted until 2026-04-22 17:56:17.555311531 +0000 UTC m=+140.343901295 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls") pod "image-registry-686b9f6c49-s6jhd" (UID: "518038bf-0a79-4055-8085-82633fe5df1d") : secret "image-registry-tls" not found Apr 22 17:56:16.756290 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:16.756207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" Apr 22 17:56:16.756430 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.756354 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:56:16.756430 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:16.756418 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls podName:57424c14-a626-4d2e-9057-40a634e4ecd2 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:20.756401394 +0000 UTC m=+143.544991153 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xv6r7" (UID: "57424c14-a626-4d2e-9057-40a634e4ecd2") : secret "samples-operator-tls" not found Apr 22 17:56:17.358516 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:17.358474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" event={"ID":"938d50a9-283b-4aa2-b8ec-60b629bd4253","Type":"ContainerStarted","Data":"617ec5bee52d13a4fda61b22021ac8dc8d2357ea2588228613d0cda737bb1054"} Apr 22 17:56:17.358943 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:17.358781 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:56:17.360173 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:17.360138 2578 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-7mmh4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" start-of-body= Apr 22 17:56:17.360295 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:17.360209 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.10:8443/readyz\": dial tcp 10.134.0.10:8443: connect: connection refused" Apr 22 17:56:17.375574 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:17.375516 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podStartSLOduration=0.566956787 podStartE2EDuration="2.375500694s" podCreationTimestamp="2026-04-22 17:56:15 +0000 
UTC" firstStartedPulling="2026-04-22 17:56:15.464056132 +0000 UTC m=+138.252645876" lastFinishedPulling="2026-04-22 17:56:17.27260003 +0000 UTC m=+140.061189783" observedRunningTime="2026-04-22 17:56:17.374703954 +0000 UTC m=+140.163293717" watchObservedRunningTime="2026-04-22 17:56:17.375500694 +0000 UTC m=+140.164090460" Apr 22 17:56:17.562986 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:17.562897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:17.563137 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:17.563005 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:56:17.563137 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:17.563017 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-686b9f6c49-s6jhd: secret "image-registry-tls" not found Apr 22 17:56:17.563137 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:17.563069 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls podName:518038bf-0a79-4055-8085-82633fe5df1d nodeName:}" failed. No retries permitted until 2026-04-22 17:56:19.56305533 +0000 UTC m=+142.351645073 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls") pod "image-registry-686b9f6c49-s6jhd" (UID: "518038bf-0a79-4055-8085-82633fe5df1d") : secret "image-registry-tls" not found Apr 22 17:56:18.361847 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:18.361819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/0.log" Apr 22 17:56:18.362247 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:18.361858 2578 generic.go:358] "Generic (PLEG): container finished" podID="938d50a9-283b-4aa2-b8ec-60b629bd4253" containerID="617ec5bee52d13a4fda61b22021ac8dc8d2357ea2588228613d0cda737bb1054" exitCode=255 Apr 22 17:56:18.362247 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:18.361895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" event={"ID":"938d50a9-283b-4aa2-b8ec-60b629bd4253","Type":"ContainerDied","Data":"617ec5bee52d13a4fda61b22021ac8dc8d2357ea2588228613d0cda737bb1054"} Apr 22 17:56:18.362247 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:18.362154 2578 scope.go:117] "RemoveContainer" containerID="617ec5bee52d13a4fda61b22021ac8dc8d2357ea2588228613d0cda737bb1054" Apr 22 17:56:19.365543 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.365515 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/1.log" Apr 22 17:56:19.365987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.365841 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/0.log" Apr 22 17:56:19.365987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.365872 2578 
generic.go:358] "Generic (PLEG): container finished" podID="938d50a9-283b-4aa2-b8ec-60b629bd4253" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" exitCode=255 Apr 22 17:56:19.365987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.365924 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" event={"ID":"938d50a9-283b-4aa2-b8ec-60b629bd4253","Type":"ContainerDied","Data":"c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1"} Apr 22 17:56:19.365987 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.365951 2578 scope.go:117] "RemoveContainer" containerID="617ec5bee52d13a4fda61b22021ac8dc8d2357ea2588228613d0cda737bb1054" Apr 22 17:56:19.366226 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.366207 2578 scope.go:117] "RemoveContainer" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" Apr 22 17:56:19.366421 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:19.366394 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 17:56:19.578300 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:19.578255 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:19.578495 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:19.578406 2578 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:56:19.578495 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:19.578428 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-686b9f6c49-s6jhd: secret "image-registry-tls" not found Apr 22 17:56:19.578495 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:19.578480 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls podName:518038bf-0a79-4055-8085-82633fe5df1d nodeName:}" failed. No retries permitted until 2026-04-22 17:56:23.578465257 +0000 UTC m=+146.367054998 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls") pod "image-registry-686b9f6c49-s6jhd" (UID: "518038bf-0a79-4055-8085-82633fe5df1d") : secret "image-registry-tls" not found Apr 22 17:56:20.208601 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.208565 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w"] Apr 22 17:56:20.213736 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.213707 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" Apr 22 17:56:20.216875 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.216855 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-vhqbs\"" Apr 22 17:56:20.219511 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.219485 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w"] Apr 22 17:56:20.284106 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.284068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chdtk\" (UniqueName: \"kubernetes.io/projected/84ebfb35-ba39-488d-9650-3d97856af9a3-kube-api-access-chdtk\") pod \"network-check-source-8894fc9bd-vpl9w\" (UID: \"84ebfb35-ba39-488d-9650-3d97856af9a3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" Apr 22 17:56:20.369179 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.369155 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/1.log" Apr 22 17:56:20.369635 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.369583 2578 scope.go:117] "RemoveContainer" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" Apr 22 17:56:20.369811 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:20.369790 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 
17:56:20.385351 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.385307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chdtk\" (UniqueName: \"kubernetes.io/projected/84ebfb35-ba39-488d-9650-3d97856af9a3-kube-api-access-chdtk\") pod \"network-check-source-8894fc9bd-vpl9w\" (UID: \"84ebfb35-ba39-488d-9650-3d97856af9a3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" Apr 22 17:56:20.393715 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.393683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chdtk\" (UniqueName: \"kubernetes.io/projected/84ebfb35-ba39-488d-9650-3d97856af9a3-kube-api-access-chdtk\") pod \"network-check-source-8894fc9bd-vpl9w\" (UID: \"84ebfb35-ba39-488d-9650-3d97856af9a3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" Apr 22 17:56:20.522979 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.522946 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" Apr 22 17:56:20.638967 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.638935 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w"] Apr 22 17:56:20.641852 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:20.641810 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ebfb35_ba39_488d_9650_3d97856af9a3.slice/crio-74da3bfb9708ae57cd599bd1eba5a546fa86ab18bf3dfd4277ae5892dada7c1f WatchSource:0}: Error finding container 74da3bfb9708ae57cd599bd1eba5a546fa86ab18bf3dfd4277ae5892dada7c1f: Status 404 returned error can't find the container with id 74da3bfb9708ae57cd599bd1eba5a546fa86ab18bf3dfd4277ae5892dada7c1f Apr 22 17:56:20.788957 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:20.788856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" Apr 22 17:56:20.789119 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:20.789018 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:56:20.789119 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:20.789110 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls podName:57424c14-a626-4d2e-9057-40a634e4ecd2 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:28.789088766 +0000 UTC m=+151.577678521 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xv6r7" (UID: "57424c14-a626-4d2e-9057-40a634e4ecd2") : secret "samples-operator-tls" not found Apr 22 17:56:21.373008 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:21.372967 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" event={"ID":"84ebfb35-ba39-488d-9650-3d97856af9a3","Type":"ContainerStarted","Data":"29884bb3f56bd2a78bdfee8a06c7aa5690fb1610b5bd6124e006af7d9afd6d66"} Apr 22 17:56:21.373008 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:21.373012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" event={"ID":"84ebfb35-ba39-488d-9650-3d97856af9a3","Type":"ContainerStarted","Data":"74da3bfb9708ae57cd599bd1eba5a546fa86ab18bf3dfd4277ae5892dada7c1f"} Apr 22 17:56:21.391542 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:21.391497 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vpl9w" podStartSLOduration=1.39148324 podStartE2EDuration="1.39148324s" podCreationTimestamp="2026-04-22 17:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:21.390893775 +0000 UTC m=+144.179483539" watchObservedRunningTime="2026-04-22 17:56:21.39148324 +0000 UTC m=+144.180073003" Apr 22 17:56:23.609871 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:23.609837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " 
pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:23.610251 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:23.609958 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:56:23.610251 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:23.609969 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-686b9f6c49-s6jhd: secret "image-registry-tls" not found Apr 22 17:56:23.610251 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:23.610020 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls podName:518038bf-0a79-4055-8085-82633fe5df1d nodeName:}" failed. No retries permitted until 2026-04-22 17:56:31.61000564 +0000 UTC m=+154.398595382 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls") pod "image-registry-686b9f6c49-s6jhd" (UID: "518038bf-0a79-4055-8085-82633fe5df1d") : secret "image-registry-tls" not found Apr 22 17:56:25.339159 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:25.339119 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:56:25.339541 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:25.339475 2578 scope.go:117] "RemoveContainer" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" Apr 22 17:56:25.339646 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:25.339629 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 17:56:26.157782 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.157737 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wxhjz"] Apr 22 17:56:26.161533 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.161518 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.165014 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.164987 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 17:56:26.165299 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.165274 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 17:56:26.166075 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.166047 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 17:56:26.166199 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.166180 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 17:56:26.166247 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.166212 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-x5hpz\"" Apr 22 17:56:26.168794 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.168769 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wxhjz"] Apr 22 17:56:26.228776 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.228724 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44b17415-0db9-4314-b3f8-455fda8ada29-signing-key\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.228776 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.228778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44b17415-0db9-4314-b3f8-455fda8ada29-signing-cabundle\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.228993 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.228797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67h8v\" (UniqueName: \"kubernetes.io/projected/44b17415-0db9-4314-b3f8-455fda8ada29-kube-api-access-67h8v\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.329421 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.329377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44b17415-0db9-4314-b3f8-455fda8ada29-signing-key\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.329421 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.329423 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44b17415-0db9-4314-b3f8-455fda8ada29-signing-cabundle\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " 
pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.329565 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.329443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67h8v\" (UniqueName: \"kubernetes.io/projected/44b17415-0db9-4314-b3f8-455fda8ada29-kube-api-access-67h8v\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.330068 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.330044 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/44b17415-0db9-4314-b3f8-455fda8ada29-signing-cabundle\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.332090 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.332066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/44b17415-0db9-4314-b3f8-455fda8ada29-signing-key\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.339443 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.339414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67h8v\" (UniqueName: \"kubernetes.io/projected/44b17415-0db9-4314-b3f8-455fda8ada29-kube-api-access-67h8v\") pod \"service-ca-865cb79987-wxhjz\" (UID: \"44b17415-0db9-4314-b3f8-455fda8ada29\") " pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.473971 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.473883 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-wxhjz" Apr 22 17:56:26.590749 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:26.590716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wxhjz"] Apr 22 17:56:26.594561 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:26.594517 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b17415_0db9_4314_b3f8_455fda8ada29.slice/crio-de69178fba49e1d9381e9fdbaa46d7ab2f65a69f184b1a89737e15c94ea64d98 WatchSource:0}: Error finding container de69178fba49e1d9381e9fdbaa46d7ab2f65a69f184b1a89737e15c94ea64d98: Status 404 returned error can't find the container with id de69178fba49e1d9381e9fdbaa46d7ab2f65a69f184b1a89737e15c94ea64d98 Apr 22 17:56:27.358861 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:27.358828 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:56:27.359310 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:27.359254 2578 scope.go:117] "RemoveContainer" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" Apr 22 17:56:27.359450 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:27.359426 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 17:56:27.390034 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:27.389986 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-wxhjz" 
event={"ID":"44b17415-0db9-4314-b3f8-455fda8ada29","Type":"ContainerStarted","Data":"de69178fba49e1d9381e9fdbaa46d7ab2f65a69f184b1a89737e15c94ea64d98"} Apr 22 17:56:28.393748 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:28.393702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-wxhjz" event={"ID":"44b17415-0db9-4314-b3f8-455fda8ada29","Type":"ContainerStarted","Data":"3d16fbbb16ef22092c29a7cef7e6d33fe0a567fdf880d8751b0f0d7c1e918639"} Apr 22 17:56:28.411074 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:28.411019 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-wxhjz" podStartSLOduration=0.745845454 podStartE2EDuration="2.41100537s" podCreationTimestamp="2026-04-22 17:56:26 +0000 UTC" firstStartedPulling="2026-04-22 17:56:26.596274568 +0000 UTC m=+149.384864310" lastFinishedPulling="2026-04-22 17:56:28.261434479 +0000 UTC m=+151.050024226" observedRunningTime="2026-04-22 17:56:28.410115378 +0000 UTC m=+151.198705143" watchObservedRunningTime="2026-04-22 17:56:28.41100537 +0000 UTC m=+151.199595133" Apr 22 17:56:28.853495 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:28.853459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" Apr 22 17:56:28.853689 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:28.853617 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:56:28.853728 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:28.853689 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls podName:57424c14-a626-4d2e-9057-40a634e4ecd2 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:44.853671309 +0000 UTC m=+167.642261074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xv6r7" (UID: "57424c14-a626-4d2e-9057-40a634e4ecd2") : secret "samples-operator-tls" not found Apr 22 17:56:31.677711 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:31.677661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:31.678122 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:31.677839 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:56:31.678122 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:31.677860 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-686b9f6c49-s6jhd: secret "image-registry-tls" not found Apr 22 17:56:31.678122 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:31.677920 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls podName:518038bf-0a79-4055-8085-82633fe5df1d nodeName:}" failed. No retries permitted until 2026-04-22 17:56:47.677904785 +0000 UTC m=+170.466494540 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls") pod "image-registry-686b9f6c49-s6jhd" (UID: "518038bf-0a79-4055-8085-82633fe5df1d") : secret "image-registry-tls" not found Apr 22 17:56:34.674333 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:34.674283 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-w9kfc" podUID="7faa2f31-ff6f-410e-94f4-9f9b7810616b" Apr 22 17:56:34.685444 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:34.685401 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9h2ht" podUID="8201d0cd-3948-40c0-a905-1339af44c0e0" Apr 22 17:56:35.414871 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:35.414839 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:56:35.779553 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:35.779517 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-4jgqt" podUID="8beb4b83-ece9-44df-bc80-fea79bf050d5" Apr 22 17:56:39.640908 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:39.640854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:56:39.640908 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:39.640913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:56:39.643422 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:39.643397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8201d0cd-3948-40c0-a905-1339af44c0e0-metrics-tls\") pod \"dns-default-9h2ht\" (UID: \"8201d0cd-3948-40c0-a905-1339af44c0e0\") " pod="openshift-dns/dns-default-9h2ht" Apr 22 17:56:39.643613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:39.643593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7faa2f31-ff6f-410e-94f4-9f9b7810616b-cert\") pod \"ingress-canary-w9kfc\" (UID: \"7faa2f31-ff6f-410e-94f4-9f9b7810616b\") " pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:56:39.762700 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:56:39.762664 2578 scope.go:117] "RemoveContainer" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" Apr 22 17:56:39.918068 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:39.917982 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w7jfh\"" Apr 22 17:56:39.926265 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:39.926244 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w9kfc" Apr 22 17:56:40.060133 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.060101 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w9kfc"] Apr 22 17:56:40.063527 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:40.063489 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7faa2f31_ff6f_410e_94f4_9f9b7810616b.slice/crio-ada244e3e4ace25d41330af415369fea46975443830d85928d0318da452a99dd WatchSource:0}: Error finding container ada244e3e4ace25d41330af415369fea46975443830d85928d0318da452a99dd: Status 404 returned error can't find the container with id ada244e3e4ace25d41330af415369fea46975443830d85928d0318da452a99dd Apr 22 17:56:40.428588 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.428561 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 17:56:40.428951 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.428935 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/1.log" Apr 22 17:56:40.429060 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.428972 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="938d50a9-283b-4aa2-b8ec-60b629bd4253" containerID="ac7eb162c6dc96734cdfe7c099826cede9339792bffb57d9ac54d773ad341eed" exitCode=255 Apr 22 17:56:40.429060 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.429043 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" event={"ID":"938d50a9-283b-4aa2-b8ec-60b629bd4253","Type":"ContainerDied","Data":"ac7eb162c6dc96734cdfe7c099826cede9339792bffb57d9ac54d773ad341eed"} Apr 22 17:56:40.429161 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.429075 2578 scope.go:117] "RemoveContainer" containerID="c0f5092258656c6267f58c5cf50acc1a3d19e659f0215eb13ad4b0b666b225b1" Apr 22 17:56:40.429418 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.429404 2578 scope.go:117] "RemoveContainer" containerID="ac7eb162c6dc96734cdfe7c099826cede9339792bffb57d9ac54d773ad341eed" Apr 22 17:56:40.429641 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:40.429615 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 17:56:40.430103 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:40.430065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w9kfc" event={"ID":"7faa2f31-ff6f-410e-94f4-9f9b7810616b","Type":"ContainerStarted","Data":"ada244e3e4ace25d41330af415369fea46975443830d85928d0318da452a99dd"} Apr 22 17:56:41.434348 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:41.434312 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" 
Apr 22 17:56:42.438586 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:42.438495 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w9kfc" event={"ID":"7faa2f31-ff6f-410e-94f4-9f9b7810616b","Type":"ContainerStarted","Data":"ec26f120c7207e4980ec9d3677ad5d84b09a8dbd965dcd22075019edd31571e7"} Apr 22 17:56:42.456303 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:42.456254 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w9kfc" podStartSLOduration=129.946751569 podStartE2EDuration="2m11.456239783s" podCreationTimestamp="2026-04-22 17:54:31 +0000 UTC" firstStartedPulling="2026-04-22 17:56:40.065474302 +0000 UTC m=+162.854064043" lastFinishedPulling="2026-04-22 17:56:41.574962514 +0000 UTC m=+164.363552257" observedRunningTime="2026-04-22 17:56:42.455936879 +0000 UTC m=+165.244526640" watchObservedRunningTime="2026-04-22 17:56:42.456239783 +0000 UTC m=+165.244829588" Apr 22 17:56:44.883529 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:44.883492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" Apr 22 17:56:44.886014 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:44.885988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/57424c14-a626-4d2e-9057-40a634e4ecd2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xv6r7\" (UID: \"57424c14-a626-4d2e-9057-40a634e4ecd2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" Apr 22 17:56:45.135365 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:56:45.135278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" Apr 22 17:56:45.255264 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:45.255211 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7"] Apr 22 17:56:45.339063 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:45.339029 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:56:45.339405 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:45.339392 2578 scope.go:117] "RemoveContainer" containerID="ac7eb162c6dc96734cdfe7c099826cede9339792bffb57d9ac54d773ad341eed" Apr 22 17:56:45.339558 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:45.339541 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 17:56:45.446582 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:45.446496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" event={"ID":"57424c14-a626-4d2e-9057-40a634e4ecd2","Type":"ContainerStarted","Data":"638dbc474b853869d176e3838ccd8bced11209d394ca758870dc51cbb94252df"} Apr 22 17:56:46.761696 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:46.761665 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:56:47.359841 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.359799 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:56:47.360287 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.360266 2578 scope.go:117] "RemoveContainer" containerID="ac7eb162c6dc96734cdfe7c099826cede9339792bffb57d9ac54d773ad341eed" Apr 22 17:56:47.360484 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:47.360463 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-7mmh4_openshift-console-operator(938d50a9-283b-4aa2-b8ec-60b629bd4253)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" podUID="938d50a9-283b-4aa2-b8ec-60b629bd4253" Apr 22 17:56:47.452851 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.452816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" event={"ID":"57424c14-a626-4d2e-9057-40a634e4ecd2","Type":"ContainerStarted","Data":"353bcf68997fa4b892721cf0a993521239611407d211b83685efaece28b03e41"} Apr 22 17:56:47.452851 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.452854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" event={"ID":"57424c14-a626-4d2e-9057-40a634e4ecd2","Type":"ContainerStarted","Data":"41085322bb0ff2a61c79b972d5e20e3742844a9232e01f0b275d3585702b01b0"} Apr 22 17:56:47.472549 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.472500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xv6r7" 
podStartSLOduration=32.952737433 podStartE2EDuration="34.472486822s" podCreationTimestamp="2026-04-22 17:56:13 +0000 UTC" firstStartedPulling="2026-04-22 17:56:45.295905927 +0000 UTC m=+168.084495669" lastFinishedPulling="2026-04-22 17:56:46.815655297 +0000 UTC m=+169.604245058" observedRunningTime="2026-04-22 17:56:47.471908467 +0000 UTC m=+170.260498231" watchObservedRunningTime="2026-04-22 17:56:47.472486822 +0000 UTC m=+170.261076600" Apr 22 17:56:47.705152 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.705058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:47.707687 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.707661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"image-registry-686b9f6c49-s6jhd\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:47.977935 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:47.977854 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:48.100513 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.100483 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-686b9f6c49-s6jhd"] Apr 22 17:56:48.104241 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:48.104207 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod518038bf_0a79_4055_8085_82633fe5df1d.slice/crio-4835079bd5dd1dff03c77bd9b728dd45f7d152067e3c6c06e55b004553e52ed2 WatchSource:0}: Error finding container 4835079bd5dd1dff03c77bd9b728dd45f7d152067e3c6c06e55b004553e52ed2: Status 404 returned error can't find the container with id 4835079bd5dd1dff03c77bd9b728dd45f7d152067e3c6c06e55b004553e52ed2 Apr 22 17:56:48.254461 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.254379 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6qmlv"] Apr 22 17:56:48.258177 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.258156 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.261032 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.261004 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 17:56:48.261237 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.261058 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:56:48.261472 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.261445 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 17:56:48.262076 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.262058 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-cl79w\"" Apr 22 17:56:48.262227 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.262061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:56:48.272448 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.272424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6qmlv"] Apr 22 17:56:48.285285 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.285255 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-686b9f6c49-s6jhd"] Apr 22 17:56:48.310109 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.310057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-data-volume\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " 
pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.310109 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.310115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-crio-socket\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.310348 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.310178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.310348 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.310251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.310348 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.310280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzd6\" (UniqueName: \"kubernetes.io/projected/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-kube-api-access-wzzd6\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.362655 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.362621 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-image-registry/image-registry-75545b9c8c-9xb6c"] Apr 22 17:56:48.365600 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.365575 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.382538 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.382510 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75545b9c8c-9xb6c"] Apr 22 17:56:48.410816 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1337fc7-a968-4b05-8ed3-4921f082016a-registry-certificates\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.410995 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-registry-tls\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.410995 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1337fc7-a968-4b05-8ed3-4921f082016a-trusted-ca\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.410995 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410888 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9np6m\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-kube-api-access-9np6m\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.410995 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.410995 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1337fc7-a968-4b05-8ed3-4921f082016a-ca-trust-extracted\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.410995 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.410979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzd6\" (UniqueName: \"kubernetes.io/projected/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-kube-api-access-wzzd6\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.411283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-bound-sa-token\") pod 
\"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.411283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-data-volume\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.411283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1337fc7-a968-4b05-8ed3-4921f082016a-installation-pull-secrets\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.411283 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1337fc7-a968-4b05-8ed3-4921f082016a-image-registry-private-configuration\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.411451 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-crio-socket\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.411451 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:56:48.411359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.411451 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-crio-socket\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.411451 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411428 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.411577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.411468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-data-volume\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.413846 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.413822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6qmlv\" 
(UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.435827 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.435792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzd6\" (UniqueName: \"kubernetes.io/projected/817cb43c-5c91-44cf-b3e9-cc2c4b8b6449-kube-api-access-wzzd6\") pod \"insights-runtime-extractor-6qmlv\" (UID: \"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449\") " pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.456577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.456544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" event={"ID":"518038bf-0a79-4055-8085-82633fe5df1d","Type":"ContainerStarted","Data":"00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa"} Apr 22 17:56:48.456577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.456580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" event={"ID":"518038bf-0a79-4055-8085-82633fe5df1d","Type":"ContainerStarted","Data":"4835079bd5dd1dff03c77bd9b728dd45f7d152067e3c6c06e55b004553e52ed2"} Apr 22 17:56:48.456865 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.456770 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:56:48.483712 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.483526 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" podStartSLOduration=33.48350669 podStartE2EDuration="33.48350669s" podCreationTimestamp="2026-04-22 17:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:48.483474157 +0000 UTC m=+171.272063939" 
watchObservedRunningTime="2026-04-22 17:56:48.48350669 +0000 UTC m=+171.272096449" Apr 22 17:56:48.511899 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.511866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-registry-tls\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512047 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.511915 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1337fc7-a968-4b05-8ed3-4921f082016a-trusted-ca\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512205 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512192 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9np6m\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-kube-api-access-9np6m\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512258 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1337fc7-a968-4b05-8ed3-4921f082016a-ca-trust-extracted\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512316 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-bound-sa-token\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512375 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1337fc7-a968-4b05-8ed3-4921f082016a-installation-pull-secrets\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512423 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d1337fc7-a968-4b05-8ed3-4921f082016a-image-registry-private-configuration\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512516 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1337fc7-a968-4b05-8ed3-4921f082016a-registry-certificates\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.512966 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.512934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d1337fc7-a968-4b05-8ed3-4921f082016a-ca-trust-extracted\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " 
pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.513127 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.513099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d1337fc7-a968-4b05-8ed3-4921f082016a-trusted-ca\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.513282 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.513255 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d1337fc7-a968-4b05-8ed3-4921f082016a-registry-certificates\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.515036 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.515011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-registry-tls\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.515286 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.515263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d1337fc7-a968-4b05-8ed3-4921f082016a-installation-pull-secrets\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.515407 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.515388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d1337fc7-a968-4b05-8ed3-4921f082016a-image-registry-private-configuration\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.526077 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.526047 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-bound-sa-token\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.526271 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.526252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9np6m\" (UniqueName: \"kubernetes.io/projected/d1337fc7-a968-4b05-8ed3-4921f082016a-kube-api-access-9np6m\") pod \"image-registry-75545b9c8c-9xb6c\" (UID: \"d1337fc7-a968-4b05-8ed3-4921f082016a\") " pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.568381 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.568291 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6qmlv" Apr 22 17:56:48.674264 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.674222 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:48.689886 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.689857 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6qmlv"] Apr 22 17:56:48.693266 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:48.693219 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817cb43c_5c91_44cf_b3e9_cc2c4b8b6449.slice/crio-090657eeaa4c24f388deeedf3b1de9e2557720d65d9d25f0f840e33484c9aaf5 WatchSource:0}: Error finding container 090657eeaa4c24f388deeedf3b1de9e2557720d65d9d25f0f840e33484c9aaf5: Status 404 returned error can't find the container with id 090657eeaa4c24f388deeedf3b1de9e2557720d65d9d25f0f840e33484c9aaf5 Apr 22 17:56:48.761823 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.761791 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9h2ht" Apr 22 17:56:48.765226 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.764965 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-q8m4n\"" Apr 22 17:56:48.772519 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.772492 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9h2ht" Apr 22 17:56:48.806495 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.806461 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-75545b9c8c-9xb6c"] Apr 22 17:56:48.809736 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:48.809709 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1337fc7_a968_4b05_8ed3_4921f082016a.slice/crio-5c389d468167761d0da2aea88e9e4b36a4b813f641d3b03b6fc219e65da8926c WatchSource:0}: Error finding container 5c389d468167761d0da2aea88e9e4b36a4b813f641d3b03b6fc219e65da8926c: Status 404 returned error can't find the container with id 5c389d468167761d0da2aea88e9e4b36a4b813f641d3b03b6fc219e65da8926c Apr 22 17:56:48.902078 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:48.902039 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9h2ht"] Apr 22 17:56:48.905008 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:48.904978 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8201d0cd_3948_40c0_a905_1339af44c0e0.slice/crio-f8897b9a149f04c3f2f9a6f80b19b57784b84b2df4fb48c83c69f2740fe0d167 WatchSource:0}: Error finding container f8897b9a149f04c3f2f9a6f80b19b57784b84b2df4fb48c83c69f2740fe0d167: Status 404 returned error can't find the container with id f8897b9a149f04c3f2f9a6f80b19b57784b84b2df4fb48c83c69f2740fe0d167 Apr 22 17:56:49.461656 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.461616 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6qmlv" event={"ID":"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449","Type":"ContainerStarted","Data":"cac3d8eda984c16073ef5df777b3c88fc72636dda55627b8230cef496bc5f8dd"} Apr 22 17:56:49.461656 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.461663 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6qmlv" event={"ID":"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449","Type":"ContainerStarted","Data":"2ae5237eb31e5cef108f0d501fe9cecd40a1ba21780a29e52524a5e7e32f0d9b"} Apr 22 17:56:49.462172 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.461678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6qmlv" event={"ID":"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449","Type":"ContainerStarted","Data":"090657eeaa4c24f388deeedf3b1de9e2557720d65d9d25f0f840e33484c9aaf5"} Apr 22 17:56:49.462818 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.462787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h2ht" event={"ID":"8201d0cd-3948-40c0-a905-1339af44c0e0","Type":"ContainerStarted","Data":"f8897b9a149f04c3f2f9a6f80b19b57784b84b2df4fb48c83c69f2740fe0d167"} Apr 22 17:56:49.464244 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.464204 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" event={"ID":"d1337fc7-a968-4b05-8ed3-4921f082016a","Type":"ContainerStarted","Data":"7256c473ae189a292e8564db77a68e23eafa0e62e58af061a1a256c0f35d8a1c"} Apr 22 17:56:49.464389 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.464251 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" event={"ID":"d1337fc7-a968-4b05-8ed3-4921f082016a","Type":"ContainerStarted","Data":"5c389d468167761d0da2aea88e9e4b36a4b813f641d3b03b6fc219e65da8926c"} Apr 22 17:56:49.464551 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.464528 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:56:49.489330 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:49.489286 2578 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" podStartSLOduration=1.489269382 podStartE2EDuration="1.489269382s" podCreationTimestamp="2026-04-22 17:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:56:49.488520061 +0000 UTC m=+172.277109826" watchObservedRunningTime="2026-04-22 17:56:49.489269382 +0000 UTC m=+172.277859145" Apr 22 17:56:50.469559 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:50.469518 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h2ht" event={"ID":"8201d0cd-3948-40c0-a905-1339af44c0e0","Type":"ContainerStarted","Data":"556b72e188ffbc2440e8acc05601c36b88e3fa4299fee07797b3e9c36c454e3c"} Apr 22 17:56:51.323642 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.323604 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ljd7r"] Apr 22 17:56:51.326604 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.326586 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.330909 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.330879 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:56:51.331069 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.330879 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 17:56:51.331069 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.330954 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 17:56:51.331069 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.331002 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 17:56:51.331069 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.330967 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-fgrhp\"" Apr 22 17:56:51.331069 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.330890 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 17:56:51.334883 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.334861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8442c44-0c99-4d0a-86a7-b9dce45aa069-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.334984 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.334961 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.335029 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.334983 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7cf\" (UniqueName: \"kubernetes.io/projected/c8442c44-0c99-4d0a-86a7-b9dce45aa069-kube-api-access-2l7cf\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.335029 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.335002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.338949 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.338931 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ljd7r"] Apr 22 17:56:51.436116 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.436079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 
17:56:51.436116 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.436117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7cf\" (UniqueName: \"kubernetes.io/projected/c8442c44-0c99-4d0a-86a7-b9dce45aa069-kube-api-access-2l7cf\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.436379 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.436141 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.436379 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.436178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8442c44-0c99-4d0a-86a7-b9dce45aa069-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.436379 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:51.436225 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 17:56:51.436379 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:51.436292 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-tls podName:c8442c44-0c99-4d0a-86a7-b9dce45aa069 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:51.936276387 +0000 UTC m=+174.724866145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-ljd7r" (UID: "c8442c44-0c99-4d0a-86a7-b9dce45aa069") : secret "prometheus-operator-tls" not found Apr 22 17:56:51.437606 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.437584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c8442c44-0c99-4d0a-86a7-b9dce45aa069-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.438665 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.438641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.445339 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.445318 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7cf\" (UniqueName: \"kubernetes.io/projected/c8442c44-0c99-4d0a-86a7-b9dce45aa069-kube-api-access-2l7cf\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.474104 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.474076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9h2ht" event={"ID":"8201d0cd-3948-40c0-a905-1339af44c0e0","Type":"ContainerStarted","Data":"812854f6d64bb7fbc74921b690d5cadc929571bb70df50926553da38dbb2ad3d"} Apr 22 
17:56:51.474535 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.474178 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9h2ht" Apr 22 17:56:51.475429 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.475403 2578 generic.go:358] "Generic (PLEG): container finished" podID="469dbdac-5689-4f0b-aa89-27b7a7f0f395" containerID="e5a59db22e011e7aba878e9f7df74ac0bff6eac6a8c46c867561653b02a64d27" exitCode=1 Apr 22 17:56:51.475555 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.475481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" event={"ID":"469dbdac-5689-4f0b-aa89-27b7a7f0f395","Type":"ContainerDied","Data":"e5a59db22e011e7aba878e9f7df74ac0bff6eac6a8c46c867561653b02a64d27"} Apr 22 17:56:51.475812 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.475794 2578 scope.go:117] "RemoveContainer" containerID="e5a59db22e011e7aba878e9f7df74ac0bff6eac6a8c46c867561653b02a64d27" Apr 22 17:56:51.477400 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.477380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6qmlv" event={"ID":"817cb43c-5c91-44cf-b3e9-cc2c4b8b6449","Type":"ContainerStarted","Data":"602662ae7c9660c87e134cbfc60555ef3461cdc607aa28b06fc46905ec83104e"} Apr 22 17:56:51.515592 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.515449 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9h2ht" podStartSLOduration=139.142698178 podStartE2EDuration="2m20.515431183s" podCreationTimestamp="2026-04-22 17:54:31 +0000 UTC" firstStartedPulling="2026-04-22 17:56:48.906686068 +0000 UTC m=+171.695275811" lastFinishedPulling="2026-04-22 17:56:50.279419072 +0000 UTC m=+173.068008816" observedRunningTime="2026-04-22 17:56:51.494050154 +0000 UTC m=+174.282639919" watchObservedRunningTime="2026-04-22 17:56:51.515431183 +0000 
UTC m=+174.304020947" Apr 22 17:56:51.516428 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.516399 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6qmlv" podStartSLOduration=1.487736267 podStartE2EDuration="3.516385991s" podCreationTimestamp="2026-04-22 17:56:48 +0000 UTC" firstStartedPulling="2026-04-22 17:56:48.765458374 +0000 UTC m=+171.554048137" lastFinishedPulling="2026-04-22 17:56:50.794108101 +0000 UTC m=+173.582697861" observedRunningTime="2026-04-22 17:56:51.515245546 +0000 UTC m=+174.303835312" watchObservedRunningTime="2026-04-22 17:56:51.516385991 +0000 UTC m=+174.304975756" Apr 22 17:56:51.940741 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.940703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:51.943744 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:51.943723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c8442c44-0c99-4d0a-86a7-b9dce45aa069-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-ljd7r\" (UID: \"c8442c44-0c99-4d0a-86a7-b9dce45aa069\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:52.235351 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:52.235264 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" Apr 22 17:56:52.356803 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:52.356749 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-ljd7r"] Apr 22 17:56:52.359480 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:52.359446 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8442c44_0c99_4d0a_86a7_b9dce45aa069.slice/crio-64f1ef039bf4cfa2f55f7e257fcc0bf95de44a38a4b0c161c1c4c2b036cb3dbf WatchSource:0}: Error finding container 64f1ef039bf4cfa2f55f7e257fcc0bf95de44a38a4b0c161c1c4c2b036cb3dbf: Status 404 returned error can't find the container with id 64f1ef039bf4cfa2f55f7e257fcc0bf95de44a38a4b0c161c1c4c2b036cb3dbf Apr 22 17:56:52.482443 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:52.482406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" event={"ID":"469dbdac-5689-4f0b-aa89-27b7a7f0f395","Type":"ContainerStarted","Data":"25e7260e4aa027dfcb08fb0f85f8fce8a833bda7c2e075a47b3552e175844508"} Apr 22 17:56:52.482897 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:52.482738 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:56:52.483362 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:52.483342 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-d5c445fb4-rdw8s" Apr 22 17:56:52.483581 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:52.483554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" 
event={"ID":"c8442c44-0c99-4d0a-86a7-b9dce45aa069","Type":"ContainerStarted","Data":"64f1ef039bf4cfa2f55f7e257fcc0bf95de44a38a4b0c161c1c4c2b036cb3dbf"} Apr 22 17:56:54.491459 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:54.491418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" event={"ID":"c8442c44-0c99-4d0a-86a7-b9dce45aa069","Type":"ContainerStarted","Data":"3349c294013ec3e86fd78524d7da05a4b2df204c86771c0192d845348e523d71"} Apr 22 17:56:54.491459 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:54.491464 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" event={"ID":"c8442c44-0c99-4d0a-86a7-b9dce45aa069","Type":"ContainerStarted","Data":"6b7b9fa5886bb32b5b6427e601143ab2ebf4cbd3948958b5c145e753472dd91c"} Apr 22 17:56:54.510380 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:54.510334 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-ljd7r" podStartSLOduration=2.335557976 podStartE2EDuration="3.510319989s" podCreationTimestamp="2026-04-22 17:56:51 +0000 UTC" firstStartedPulling="2026-04-22 17:56:52.361294712 +0000 UTC m=+175.149884457" lastFinishedPulling="2026-04-22 17:56:53.536056713 +0000 UTC m=+176.324646470" observedRunningTime="2026-04-22 17:56:54.509418528 +0000 UTC m=+177.298008304" watchObservedRunningTime="2026-04-22 17:56:54.510319989 +0000 UTC m=+177.298909752" Apr 22 17:56:56.683262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.683226 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pxjc5"] Apr 22 17:56:56.688163 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.688141 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.690711 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.690690 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7s4kf\"" Apr 22 17:56:56.691062 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.691039 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:56:56.691231 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.691209 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:56:56.694129 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.694108 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:56:56.777079 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-wtmp\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777275 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777088 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/893114b4-edb1-4f70-a308-a09c361eb8de-metrics-client-ca\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777275 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-sys\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777275 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-textfile\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777275 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhn7\" (UniqueName: \"kubernetes.io/projected/893114b4-edb1-4f70-a308-a09c361eb8de-kube-api-access-rnhn7\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777493 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-root\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777493 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777493 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:56:56.777363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-tls\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.777493 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.777409 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-accelerators-collector-config\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.878707 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-wtmp\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.878707 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/893114b4-edb1-4f70-a308-a09c361eb8de-metrics-client-ca\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878750 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-sys\") pod \"node-exporter-pxjc5\" (UID: 
\"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-textfile\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhn7\" (UniqueName: \"kubernetes.io/projected/893114b4-edb1-4f70-a308-a09c361eb8de-kube-api-access-rnhn7\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-root\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-wtmp\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-sys\") pod \"node-exporter-pxjc5\" (UID: 
\"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879018 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.878974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/893114b4-edb1-4f70-a308-a09c361eb8de-root\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879411 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.879233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-tls\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879411 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.879276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-accelerators-collector-config\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879411 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:56.879318 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 17:56:56.879411 
ip-10-0-130-112 kubenswrapper[2578]: E0422 17:56:56.879380 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-tls podName:893114b4-edb1-4f70-a308-a09c361eb8de nodeName:}" failed. No retries permitted until 2026-04-22 17:56:57.379359117 +0000 UTC m=+180.167948859 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-tls") pod "node-exporter-pxjc5" (UID: "893114b4-edb1-4f70-a308-a09c361eb8de") : secret "node-exporter-tls" not found Apr 22 17:56:56.879411 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.879243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-textfile\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879596 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.879435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/893114b4-edb1-4f70-a308-a09c361eb8de-metrics-client-ca\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.879851 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.879826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-accelerators-collector-config\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.881940 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.881910 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:56.890982 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:56.890958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhn7\" (UniqueName: \"kubernetes.io/projected/893114b4-edb1-4f70-a308-a09c361eb8de-kube-api-access-rnhn7\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:57.384587 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:57.384547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-tls\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:57.387058 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:57.387028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/893114b4-edb1-4f70-a308-a09c361eb8de-node-exporter-tls\") pod \"node-exporter-pxjc5\" (UID: \"893114b4-edb1-4f70-a308-a09c361eb8de\") " pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:57.598170 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:57.598136 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pxjc5" Apr 22 17:56:57.607220 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:56:57.607191 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod893114b4_edb1_4f70_a308_a09c361eb8de.slice/crio-4981f9894b4382feac17538dd5822653f50a60ee097df0c4f57402586a5fce61 WatchSource:0}: Error finding container 4981f9894b4382feac17538dd5822653f50a60ee097df0c4f57402586a5fce61: Status 404 returned error can't find the container with id 4981f9894b4382feac17538dd5822653f50a60ee097df0c4f57402586a5fce61 Apr 22 17:56:58.504698 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:58.504664 2578 generic.go:358] "Generic (PLEG): container finished" podID="893114b4-edb1-4f70-a308-a09c361eb8de" containerID="50dce02ae7856b4f991361e66776240d4f95916d29a0ce1c5f33a3c7298f98b8" exitCode=0 Apr 22 17:56:58.504698 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:58.504707 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pxjc5" event={"ID":"893114b4-edb1-4f70-a308-a09c361eb8de","Type":"ContainerDied","Data":"50dce02ae7856b4f991361e66776240d4f95916d29a0ce1c5f33a3c7298f98b8"} Apr 22 17:56:58.505078 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:58.504729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pxjc5" event={"ID":"893114b4-edb1-4f70-a308-a09c361eb8de","Type":"ContainerStarted","Data":"4981f9894b4382feac17538dd5822653f50a60ee097df0c4f57402586a5fce61"} Apr 22 17:56:59.508697 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:59.508656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pxjc5" event={"ID":"893114b4-edb1-4f70-a308-a09c361eb8de","Type":"ContainerStarted","Data":"beddd645a61d989fcc77979c6b34f547e92461abf253bd1318284f7661fee6c9"} Apr 22 17:56:59.508697 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:59.508697 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pxjc5" event={"ID":"893114b4-edb1-4f70-a308-a09c361eb8de","Type":"ContainerStarted","Data":"758a087f69a23b34987ef3fca348c9f673c18b405596782648a2da05dd8dd2f9"} Apr 22 17:56:59.531994 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:56:59.531936 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pxjc5" podStartSLOduration=2.836164754 podStartE2EDuration="3.531897165s" podCreationTimestamp="2026-04-22 17:56:56 +0000 UTC" firstStartedPulling="2026-04-22 17:56:57.609157165 +0000 UTC m=+180.397746907" lastFinishedPulling="2026-04-22 17:56:58.304889563 +0000 UTC m=+181.093479318" observedRunningTime="2026-04-22 17:56:59.530881968 +0000 UTC m=+182.319471744" watchObservedRunningTime="2026-04-22 17:56:59.531897165 +0000 UTC m=+182.320486929" Apr 22 17:57:01.485961 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:01.485933 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9h2ht" Apr 22 17:57:02.763011 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:02.762979 2578 scope.go:117] "RemoveContainer" containerID="ac7eb162c6dc96734cdfe7c099826cede9339792bffb57d9ac54d773ad341eed" Apr 22 17:57:03.357036 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.356999 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ktv42"] Apr 22 17:57:03.361926 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.361021 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:03.364773 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.364736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-8ndds\"" Apr 22 17:57:03.364921 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.364736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 17:57:03.364921 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.364740 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 17:57:03.371671 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.371644 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ktv42"] Apr 22 17:57:03.435211 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.435172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6r7k\" (UniqueName: \"kubernetes.io/projected/51c44193-f38a-4889-91eb-9be344531e77-kube-api-access-h6r7k\") pod \"downloads-6bcc868b7-ktv42\" (UID: \"51c44193-f38a-4889-91eb-9be344531e77\") " pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:03.521677 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.521647 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 17:57:03.521866 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.521722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" event={"ID":"938d50a9-283b-4aa2-b8ec-60b629bd4253","Type":"ContainerStarted","Data":"c76bc66c406546b2325730c79d7fa9d9e7250b51e0b6011a8ccc3870fd3f0f91"} Apr 22 17:57:03.522034 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:57:03.522014 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:57:03.526537 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.526512 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7mmh4" Apr 22 17:57:03.535943 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.535924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6r7k\" (UniqueName: \"kubernetes.io/projected/51c44193-f38a-4889-91eb-9be344531e77-kube-api-access-h6r7k\") pod \"downloads-6bcc868b7-ktv42\" (UID: \"51c44193-f38a-4889-91eb-9be344531e77\") " pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:03.545048 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.545023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6r7k\" (UniqueName: \"kubernetes.io/projected/51c44193-f38a-4889-91eb-9be344531e77-kube-api-access-h6r7k\") pod \"downloads-6bcc868b7-ktv42\" (UID: \"51c44193-f38a-4889-91eb-9be344531e77\") " pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:03.673262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.673170 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:03.804605 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:03.804549 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ktv42"] Apr 22 17:57:03.808787 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:57:03.808743 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c44193_f38a_4889_91eb_9be344531e77.slice/crio-85b9303e65d300c66467f4fd4ad5493ef1491eb0c6bb386efa7bdd4589625c7b WatchSource:0}: Error finding container 85b9303e65d300c66467f4fd4ad5493ef1491eb0c6bb386efa7bdd4589625c7b: Status 404 returned error can't find the container with id 85b9303e65d300c66467f4fd4ad5493ef1491eb0c6bb386efa7bdd4589625c7b Apr 22 17:57:04.526817 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:04.526731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ktv42" event={"ID":"51c44193-f38a-4889-91eb-9be344531e77","Type":"ContainerStarted","Data":"85b9303e65d300c66467f4fd4ad5493ef1491eb0c6bb386efa7bdd4589625c7b"} Apr 22 17:57:08.679094 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:08.679056 2578 patch_prober.go:28] interesting pod/image-registry-75545b9c8c-9xb6c container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 17:57:08.679710 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:08.679120 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" podUID="d1337fc7-a968-4b05-8ed3-4921f082016a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:57:09.471146 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:57:09.471112 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:57:10.474488 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:10.474453 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-75545b9c8c-9xb6c" Apr 22 17:57:14.485725 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:14.485649 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" podUID="518038bf-0a79-4055-8085-82633fe5df1d" containerName="registry" containerID="cri-o://00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa" gracePeriod=30 Apr 22 17:57:19.465242 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.465200 2578 patch_prober.go:28] interesting pod/image-registry-686b9f6c49-s6jhd container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.134.0.11:5000/healthz\": dial tcp 10.134.0.11:5000: connect: connection refused" start-of-body= Apr 22 17:57:19.465681 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.465279 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" podUID="518038bf-0a79-4055-8085-82633fe5df1d" containerName="registry" probeResult="failure" output="Get \"https://10.134.0.11:5000/healthz\": dial tcp 10.134.0.11:5000: connect: connection refused" Apr 22 17:57:19.787158 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.787131 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:57:19.892053 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892022 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892246 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892086 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/518038bf-0a79-4055-8085-82633fe5df1d-ca-trust-extracted\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892246 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892113 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-registry-certificates\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892378 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892276 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9cx8\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-kube-api-access-k9cx8\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892378 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892318 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-trusted-ca\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892378 
ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892343 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-bound-sa-token\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892378 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892371 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-installation-pull-secrets\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892476 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:19.892577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892538 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-image-registry-private-configuration\") pod \"518038bf-0a79-4055-8085-82633fe5df1d\" (UID: \"518038bf-0a79-4055-8085-82633fe5df1d\") " Apr 22 17:57:19.892861 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.892832 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-registry-certificates\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.893271 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.893228 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:19.895314 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.895277 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:19.895568 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.895538 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-kube-api-access-k9cx8" (OuterVolumeSpecName: "kube-api-access-k9cx8") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "kube-api-access-k9cx8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:19.895568 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.895550 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:19.895713 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.895646 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:19.895713 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.895676 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:19.901460 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.901434 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/518038bf-0a79-4055-8085-82633fe5df1d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "518038bf-0a79-4055-8085-82633fe5df1d" (UID: "518038bf-0a79-4055-8085-82633fe5df1d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:19.994262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994180 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/518038bf-0a79-4055-8085-82633fe5df1d-ca-trust-extracted\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.994419 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994267 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9cx8\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-kube-api-access-k9cx8\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.994419 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994286 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/518038bf-0a79-4055-8085-82633fe5df1d-trusted-ca\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.994419 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994298 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-bound-sa-token\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.994419 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994309 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-installation-pull-secrets\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.994419 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994322 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/518038bf-0a79-4055-8085-82633fe5df1d-image-registry-private-configuration\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:19.994419 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:19.994334 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/518038bf-0a79-4055-8085-82633fe5df1d-registry-tls\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:57:20.576261 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.576217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ktv42" event={"ID":"51c44193-f38a-4889-91eb-9be344531e77","Type":"ContainerStarted","Data":"2845b29ecb020378b37f86f173e52172b4fa94c777a2052f21c2ec5dfd0681e0"} Apr 22 17:57:20.576744 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.576432 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:20.577562 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.577532 2578 generic.go:358] "Generic (PLEG): container finished" podID="518038bf-0a79-4055-8085-82633fe5df1d" containerID="00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa" exitCode=0 Apr 22 17:57:20.577695 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.577598 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" Apr 22 17:57:20.577695 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.577627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" event={"ID":"518038bf-0a79-4055-8085-82633fe5df1d","Type":"ContainerDied","Data":"00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa"} Apr 22 17:57:20.577695 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.577654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-686b9f6c49-s6jhd" event={"ID":"518038bf-0a79-4055-8085-82633fe5df1d","Type":"ContainerDied","Data":"4835079bd5dd1dff03c77bd9b728dd45f7d152067e3c6c06e55b004553e52ed2"} Apr 22 17:57:20.577695 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.577673 2578 scope.go:117] "RemoveContainer" containerID="00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa" Apr 22 17:57:20.588924 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.588898 2578 scope.go:117] "RemoveContainer" containerID="00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa" Apr 22 17:57:20.589332 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:57:20.589297 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa\": container with ID starting with 00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa not found: ID does not exist" containerID="00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa" Apr 22 17:57:20.589433 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.589345 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa"} err="failed to get container status 
\"00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa\": rpc error: code = NotFound desc = could not find container \"00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa\": container with ID starting with 00ede73f7f0919f9f7f6c2198bce197ef50e0eabe74f4faf72cbe8a83f3ef0aa not found: ID does not exist" Apr 22 17:57:20.591889 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.591866 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ktv42" Apr 22 17:57:20.597577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.597530 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ktv42" podStartSLOduration=1.685279592 podStartE2EDuration="17.597514515s" podCreationTimestamp="2026-04-22 17:57:03 +0000 UTC" firstStartedPulling="2026-04-22 17:57:03.810598422 +0000 UTC m=+186.599188163" lastFinishedPulling="2026-04-22 17:57:19.722833345 +0000 UTC m=+202.511423086" observedRunningTime="2026-04-22 17:57:20.595703727 +0000 UTC m=+203.384293491" watchObservedRunningTime="2026-04-22 17:57:20.597514515 +0000 UTC m=+203.386104280" Apr 22 17:57:20.628321 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.628288 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-686b9f6c49-s6jhd"] Apr 22 17:57:20.635416 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:20.635382 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-686b9f6c49-s6jhd"] Apr 22 17:57:21.767126 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.767091 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="518038bf-0a79-4055-8085-82633fe5df1d" path="/var/lib/kubelet/pods/518038bf-0a79-4055-8085-82633fe5df1d/volumes" Apr 22 17:57:21.853401 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.853367 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-7795c54489-z8ljs"] Apr 22 17:57:21.853745 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.853725 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="518038bf-0a79-4055-8085-82633fe5df1d" containerName="registry" Apr 22 17:57:21.853745 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.853744 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="518038bf-0a79-4055-8085-82633fe5df1d" containerName="registry" Apr 22 17:57:21.853895 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.853823 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="518038bf-0a79-4055-8085-82633fe5df1d" containerName="registry" Apr 22 17:57:21.857619 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.857592 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:21.860136 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.860112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 17:57:21.860293 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.860272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 17:57:21.860388 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.860222 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hpgsb\"" Apr 22 17:57:21.860432 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.860208 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 17:57:21.861386 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.861067 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 17:57:21.861386 ip-10-0-130-112 
kubenswrapper[2578]: I0422 17:57:21.861094 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 17:57:21.865529 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.865468 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 17:57:21.867519 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:21.867497 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7795c54489-z8ljs"] Apr 22 17:57:22.011147 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-trusted-ca-bundle\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.011332 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-oauth-config\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.011332 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011276 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-oauth-serving-cert\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.011438 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011331 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-serving-cert\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.011438 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-config\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.011438 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011427 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-service-ca\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.011577 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.011478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dl9q\" (UniqueName: \"kubernetes.io/projected/7eeb74fe-5ae3-43a7-aca8-a17477f24784-kube-api-access-8dl9q\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112383 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dl9q\" (UniqueName: \"kubernetes.io/projected/7eeb74fe-5ae3-43a7-aca8-a17477f24784-kube-api-access-8dl9q\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " 
pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112383 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-trusted-ca-bundle\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-oauth-config\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-oauth-serving-cert\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-serving-cert\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112546 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-config\") pod 
\"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.112613 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.112570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-service-ca\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.113436 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.113406 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-trusted-ca-bundle\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.113565 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.113442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-config\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.113565 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.113488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-service-ca\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.113565 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.113529 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-oauth-serving-cert\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.115491 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.115460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-oauth-config\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.115609 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.115585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-serving-cert\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.121807 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.121740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dl9q\" (UniqueName: \"kubernetes.io/projected/7eeb74fe-5ae3-43a7-aca8-a17477f24784-kube-api-access-8dl9q\") pod \"console-7795c54489-z8ljs\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.169901 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.169855 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:22.311264 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.311228 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7795c54489-z8ljs"] Apr 22 17:57:22.315021 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:57:22.314986 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eeb74fe_5ae3_43a7_aca8_a17477f24784.slice/crio-aed1abf2a37e0d45ffe139a1967559e781711f66b5fc064823854e6c06eb765c WatchSource:0}: Error finding container aed1abf2a37e0d45ffe139a1967559e781711f66b5fc064823854e6c06eb765c: Status 404 returned error can't find the container with id aed1abf2a37e0d45ffe139a1967559e781711f66b5fc064823854e6c06eb765c Apr 22 17:57:22.586518 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:22.586478 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795c54489-z8ljs" event={"ID":"7eeb74fe-5ae3-43a7-aca8-a17477f24784","Type":"ContainerStarted","Data":"aed1abf2a37e0d45ffe139a1967559e781711f66b5fc064823854e6c06eb765c"} Apr 22 17:57:24.908586 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:24.908559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9h2ht_8201d0cd-3948-40c0-a905-1339af44c0e0/dns/0.log" Apr 22 17:57:25.108663 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:25.108581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9h2ht_8201d0cd-3948-40c0-a905-1339af44c0e0/kube-rbac-proxy/0.log" Apr 22 17:57:25.707649 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:25.707615 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6d4jf_5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb/dns-node-resolver/0.log" Apr 22 17:57:26.600401 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:26.600355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7795c54489-z8ljs" event={"ID":"7eeb74fe-5ae3-43a7-aca8-a17477f24784","Type":"ContainerStarted","Data":"fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2"} Apr 22 17:57:26.618080 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:26.618023 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7795c54489-z8ljs" podStartSLOduration=1.807251037 podStartE2EDuration="5.618004687s" podCreationTimestamp="2026-04-22 17:57:21 +0000 UTC" firstStartedPulling="2026-04-22 17:57:22.317444664 +0000 UTC m=+205.106034414" lastFinishedPulling="2026-04-22 17:57:26.128198308 +0000 UTC m=+208.916788064" observedRunningTime="2026-04-22 17:57:26.616884008 +0000 UTC m=+209.405473774" watchObservedRunningTime="2026-04-22 17:57:26.618004687 +0000 UTC m=+209.406594452" Apr 22 17:57:26.911382 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:26.911295 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-w9kfc_7faa2f31-ff6f-410e-94f4-9f9b7810616b/serve-healthcheck-canary/0.log" Apr 22 17:57:32.170253 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:32.170214 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:32.170253 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:32.170264 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:32.175130 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:32.175105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:57:32.621862 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:57:32.621830 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:58:08.593624 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:58:08.593588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:58:08.596085 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:08.596060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8beb4b83-ece9-44df-bc80-fea79bf050d5-metrics-certs\") pod \"network-metrics-daemon-4jgqt\" (UID: \"8beb4b83-ece9-44df-bc80-fea79bf050d5\") " pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:58:08.665006 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:08.664970 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bvxbv\"" Apr 22 17:58:08.673152 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:08.673121 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jgqt" Apr 22 17:58:08.795117 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:08.795083 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jgqt"] Apr 22 17:58:08.798778 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:58:08.798735 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8beb4b83_ece9_44df_bc80_fea79bf050d5.slice/crio-a51041fd1740f3e9fd62dc393573c9ed25a95dce8ee81253a10adacd1d96b2d3 WatchSource:0}: Error finding container a51041fd1740f3e9fd62dc393573c9ed25a95dce8ee81253a10adacd1d96b2d3: Status 404 returned error can't find the container with id a51041fd1740f3e9fd62dc393573c9ed25a95dce8ee81253a10adacd1d96b2d3 Apr 22 17:58:09.717472 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:09.717443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jgqt" event={"ID":"8beb4b83-ece9-44df-bc80-fea79bf050d5","Type":"ContainerStarted","Data":"a51041fd1740f3e9fd62dc393573c9ed25a95dce8ee81253a10adacd1d96b2d3"} Apr 22 17:58:10.721831 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:10.721794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jgqt" event={"ID":"8beb4b83-ece9-44df-bc80-fea79bf050d5","Type":"ContainerStarted","Data":"0ae24f406954648149a4bdc93eca5dfbc39614afa7ce13b00aac460a45524e9c"} Apr 22 17:58:10.721831 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:10.721833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jgqt" event={"ID":"8beb4b83-ece9-44df-bc80-fea79bf050d5","Type":"ContainerStarted","Data":"edd7d404fe56a2fc628c4cbe415eef21beb835f16f54bd2bf45729504feb0941"} Apr 22 17:58:10.740473 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:10.740426 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-4jgqt" podStartSLOduration=252.829326017 podStartE2EDuration="4m13.740412743s" podCreationTimestamp="2026-04-22 17:53:57 +0000 UTC" firstStartedPulling="2026-04-22 17:58:08.801035797 +0000 UTC m=+251.589625540" lastFinishedPulling="2026-04-22 17:58:09.712122521 +0000 UTC m=+252.500712266" observedRunningTime="2026-04-22 17:58:10.739475283 +0000 UTC m=+253.528065047" watchObservedRunningTime="2026-04-22 17:58:10.740412743 +0000 UTC m=+253.529002564" Apr 22 17:58:36.578962 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:36.578928 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7795c54489-z8ljs"] Apr 22 17:58:57.648036 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:57.648002 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 17:58:57.648508 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:57.648101 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 17:58:57.658158 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:58:57.658129 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 17:59:01.598849 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.598792 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7795c54489-z8ljs" podUID="7eeb74fe-5ae3-43a7-aca8-a17477f24784" containerName="console" containerID="cri-o://fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2" gracePeriod=15 Apr 22 17:59:01.843176 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.843152 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7795c54489-z8ljs_7eeb74fe-5ae3-43a7-aca8-a17477f24784/console/0.log" Apr 22 
17:59:01.843302 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.843218 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:59:01.863902 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.863829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7795c54489-z8ljs_7eeb74fe-5ae3-43a7-aca8-a17477f24784/console/0.log" Apr 22 17:59:01.863902 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.863870 2578 generic.go:358] "Generic (PLEG): container finished" podID="7eeb74fe-5ae3-43a7-aca8-a17477f24784" containerID="fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2" exitCode=2 Apr 22 17:59:01.864105 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.863913 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795c54489-z8ljs" event={"ID":"7eeb74fe-5ae3-43a7-aca8-a17477f24784","Type":"ContainerDied","Data":"fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2"} Apr 22 17:59:01.864105 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.863942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7795c54489-z8ljs" event={"ID":"7eeb74fe-5ae3-43a7-aca8-a17477f24784","Type":"ContainerDied","Data":"aed1abf2a37e0d45ffe139a1967559e781711f66b5fc064823854e6c06eb765c"} Apr 22 17:59:01.864105 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.863962 2578 scope.go:117] "RemoveContainer" containerID="fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2" Apr 22 17:59:01.864105 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.863966 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7795c54489-z8ljs" Apr 22 17:59:01.871834 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.871813 2578 scope.go:117] "RemoveContainer" containerID="fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2" Apr 22 17:59:01.872120 ip-10-0-130-112 kubenswrapper[2578]: E0422 17:59:01.872095 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2\": container with ID starting with fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2 not found: ID does not exist" containerID="fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2" Apr 22 17:59:01.872178 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.872136 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2"} err="failed to get container status \"fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2\": rpc error: code = NotFound desc = could not find container \"fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2\": container with ID starting with fae761e0dd3f7a7775753babff7c45403c195fb018df5b5856645c2ba42cd0d2 not found: ID does not exist" Apr 22 17:59:01.892262 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892234 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-service-ca\") pod \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892395 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892279 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dl9q\" (UniqueName: 
\"kubernetes.io/projected/7eeb74fe-5ae3-43a7-aca8-a17477f24784-kube-api-access-8dl9q\") pod \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892395 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892308 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-config\") pod \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892395 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892353 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-oauth-serving-cert\") pod \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892395 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892375 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-trusted-ca-bundle\") pod \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892395 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892391 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-serving-cert\") pod \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892628 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892407 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-oauth-config\") pod 
\"7eeb74fe-5ae3-43a7-aca8-a17477f24784\" (UID: \"7eeb74fe-5ae3-43a7-aca8-a17477f24784\") " Apr 22 17:59:01.892786 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892733 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-service-ca" (OuterVolumeSpecName: "service-ca") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:59:01.892885 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892846 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:59:01.892937 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892882 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-config" (OuterVolumeSpecName: "console-config") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:59:01.892937 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.892889 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:59:01.894723 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.894698 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eeb74fe-5ae3-43a7-aca8-a17477f24784-kube-api-access-8dl9q" (OuterVolumeSpecName: "kube-api-access-8dl9q") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "kube-api-access-8dl9q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:59:01.894839 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.894727 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:59:01.894886 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.894854 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7eeb74fe-5ae3-43a7-aca8-a17477f24784" (UID: "7eeb74fe-5ae3-43a7-aca8-a17477f24784"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:59:01.993722 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993682 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-oauth-serving-cert\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:01.993722 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993718 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-trusted-ca-bundle\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:01.993722 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993727 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-serving-cert\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:01.993972 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993737 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-oauth-config\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:01.993972 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993746 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-service-ca\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:01.993972 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993780 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dl9q\" (UniqueName: \"kubernetes.io/projected/7eeb74fe-5ae3-43a7-aca8-a17477f24784-kube-api-access-8dl9q\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:01.993972 
ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:01.993789 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7eeb74fe-5ae3-43a7-aca8-a17477f24784-console-config\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 17:59:02.186302 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:02.186266 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7795c54489-z8ljs"] Apr 22 17:59:02.188664 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:02.188636 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7795c54489-z8ljs"] Apr 22 17:59:03.766479 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:03.766440 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eeb74fe-5ae3-43a7-aca8-a17477f24784" path="/var/lib/kubelet/pods/7eeb74fe-5ae3-43a7-aca8-a17477f24784/volumes" Apr 22 17:59:34.914628 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.914546 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cfc449888-ghpwq"] Apr 22 17:59:34.915052 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.914876 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7eeb74fe-5ae3-43a7-aca8-a17477f24784" containerName="console" Apr 22 17:59:34.915052 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.914890 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeb74fe-5ae3-43a7-aca8-a17477f24784" containerName="console" Apr 22 17:59:34.915052 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.914942 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="7eeb74fe-5ae3-43a7-aca8-a17477f24784" containerName="console" Apr 22 17:59:34.917737 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.917715 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:34.923994 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.923961 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hpgsb\"" Apr 22 17:59:34.925501 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.925476 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 17:59:34.925612 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.925589 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 17:59:34.927644 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.927625 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 17:59:34.927784 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.927670 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 17:59:34.928932 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.928914 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 17:59:34.933335 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.933317 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 17:59:34.938821 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:34.938799 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cfc449888-ghpwq"] Apr 22 17:59:35.037360 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.037330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bwv\" (UniqueName: 
\"kubernetes.io/projected/14635ee6-5fdd-4278-b07a-8414bbf58feb-kube-api-access-66bwv\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.037553 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.037368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-service-ca\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.037553 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.037390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-oauth-serving-cert\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.037553 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.037496 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-config\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.037553 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.037528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-trusted-ca-bundle\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.037553 ip-10-0-130-112 kubenswrapper[2578]: I0422 
17:59:35.037554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-serving-cert\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.037726 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.037621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-oauth-config\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.138926 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.138888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66bwv\" (UniqueName: \"kubernetes.io/projected/14635ee6-5fdd-4278-b07a-8414bbf58feb-kube-api-access-66bwv\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.138926 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.138931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-service-ca\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139183 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.138953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-oauth-serving-cert\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " 
pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139183 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.138979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-config\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139183 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-trusted-ca-bundle\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139183 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-serving-cert\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139183 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-oauth-config\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139793 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139738 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-config\") pod \"console-cfc449888-ghpwq\" 
(UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139922 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-service-ca\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.139972 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-trusted-ca-bundle\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.140008 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.139914 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-oauth-serving-cert\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.141526 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.141500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-oauth-config\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.141732 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.141715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-serving-cert\") pod 
\"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.147340 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.147306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bwv\" (UniqueName: \"kubernetes.io/projected/14635ee6-5fdd-4278-b07a-8414bbf58feb-kube-api-access-66bwv\") pod \"console-cfc449888-ghpwq\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.226955 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.226869 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:35.353556 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.353530 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cfc449888-ghpwq"] Apr 22 17:59:35.356137 ip-10-0-130-112 kubenswrapper[2578]: W0422 17:59:35.356104 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14635ee6_5fdd_4278_b07a_8414bbf58feb.slice/crio-00d40a124a6d2cfb4b78099510480427e8d82b8efb8c7f076c157b417b7fce02 WatchSource:0}: Error finding container 00d40a124a6d2cfb4b78099510480427e8d82b8efb8c7f076c157b417b7fce02: Status 404 returned error can't find the container with id 00d40a124a6d2cfb4b78099510480427e8d82b8efb8c7f076c157b417b7fce02 Apr 22 17:59:35.358367 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.358351 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:59:35.958643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.958597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfc449888-ghpwq" 
event={"ID":"14635ee6-5fdd-4278-b07a-8414bbf58feb","Type":"ContainerStarted","Data":"3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7"} Apr 22 17:59:35.958643 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.958646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfc449888-ghpwq" event={"ID":"14635ee6-5fdd-4278-b07a-8414bbf58feb","Type":"ContainerStarted","Data":"00d40a124a6d2cfb4b78099510480427e8d82b8efb8c7f076c157b417b7fce02"} Apr 22 17:59:35.977140 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:35.977083 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cfc449888-ghpwq" podStartSLOduration=1.977066201 podStartE2EDuration="1.977066201s" podCreationTimestamp="2026-04-22 17:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:59:35.97622774 +0000 UTC m=+338.764817504" watchObservedRunningTime="2026-04-22 17:59:35.977066201 +0000 UTC m=+338.765655964" Apr 22 17:59:45.227583 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:45.227486 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:45.227583 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:45.227527 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:45.232993 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:45.232970 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 17:59:45.990352 ip-10-0-130-112 kubenswrapper[2578]: I0422 17:59:45.990325 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 18:00:23.217977 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.217941 2578 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nt9hd"] Apr 22 18:00:23.221104 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.221078 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.223335 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.223317 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:00:23.228974 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.228950 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nt9hd"] Apr 22 18:00:23.317977 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.317937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0bfef4cd-ef07-46fa-889a-cf444d67efc9-dbus\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.317977 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.317984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0bfef4cd-ef07-46fa-889a-cf444d67efc9-original-pull-secret\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.318183 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.318021 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0bfef4cd-ef07-46fa-889a-cf444d67efc9-kubelet-config\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 
18:00:23.418510 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.418463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0bfef4cd-ef07-46fa-889a-cf444d67efc9-kubelet-config\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.418510 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.418512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0bfef4cd-ef07-46fa-889a-cf444d67efc9-dbus\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.418793 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.418540 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0bfef4cd-ef07-46fa-889a-cf444d67efc9-original-pull-secret\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.418793 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.418589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0bfef4cd-ef07-46fa-889a-cf444d67efc9-kubelet-config\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.418793 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.418708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0bfef4cd-ef07-46fa-889a-cf444d67efc9-dbus\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " 
pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.421069 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.421048 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0bfef4cd-ef07-46fa-889a-cf444d67efc9-original-pull-secret\") pod \"global-pull-secret-syncer-nt9hd\" (UID: \"0bfef4cd-ef07-46fa-889a-cf444d67efc9\") " pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.530953 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.530925 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nt9hd" Apr 22 18:00:23.658816 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:23.658782 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nt9hd"] Apr 22 18:00:23.661937 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:00:23.661906 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bfef4cd_ef07_46fa_889a_cf444d67efc9.slice/crio-23481be4dce16a0bc72386c8b6f799d2f8725248d4a12c4632dd22dcc5a664ab WatchSource:0}: Error finding container 23481be4dce16a0bc72386c8b6f799d2f8725248d4a12c4632dd22dcc5a664ab: Status 404 returned error can't find the container with id 23481be4dce16a0bc72386c8b6f799d2f8725248d4a12c4632dd22dcc5a664ab Apr 22 18:00:24.087422 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:24.087373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nt9hd" event={"ID":"0bfef4cd-ef07-46fa-889a-cf444d67efc9","Type":"ContainerStarted","Data":"23481be4dce16a0bc72386c8b6f799d2f8725248d4a12c4632dd22dcc5a664ab"} Apr 22 18:00:28.101183 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:28.101139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nt9hd" 
event={"ID":"0bfef4cd-ef07-46fa-889a-cf444d67efc9","Type":"ContainerStarted","Data":"8939956d8be90e7fb078843138aa70d06b5c7e572e9fd581a2c790a5eeda5481"} Apr 22 18:00:28.116069 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:00:28.116010 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nt9hd" podStartSLOduration=1.134783875 podStartE2EDuration="5.115990569s" podCreationTimestamp="2026-04-22 18:00:23 +0000 UTC" firstStartedPulling="2026-04-22 18:00:23.663925238 +0000 UTC m=+386.452514980" lastFinishedPulling="2026-04-22 18:00:27.645131915 +0000 UTC m=+390.433721674" observedRunningTime="2026-04-22 18:00:28.115321961 +0000 UTC m=+390.903911749" watchObservedRunningTime="2026-04-22 18:00:28.115990569 +0000 UTC m=+390.904580335" Apr 22 18:01:17.196335 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.196245 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm"] Apr 22 18:01:17.199423 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.199402 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.202045 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.202019 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:01:17.202855 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.202837 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gq982\"" Apr 22 18:01:17.202975 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.202950 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:01:17.203100 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.203083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:01:17.218721 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.218693 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm"] Apr 22 18:01:17.338218 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.338180 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm\" (UID: \"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.338403 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.338234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fwx2\" (UniqueName: \"kubernetes.io/projected/4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff-kube-api-access-6fwx2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm\" (UID: 
\"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.439534 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.439495 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm\" (UID: \"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.439706 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.439549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fwx2\" (UniqueName: \"kubernetes.io/projected/4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff-kube-api-access-6fwx2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm\" (UID: \"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.442068 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.442048 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm\" (UID: \"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.450150 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.450076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fwx2\" (UniqueName: \"kubernetes.io/projected/4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff-kube-api-access-6fwx2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm\" (UID: \"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.509065 ip-10-0-130-112 
kubenswrapper[2578]: I0422 18:01:17.509021 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" Apr 22 18:01:17.648223 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:17.648198 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm"] Apr 22 18:01:17.650661 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:01:17.650630 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aaae6e0_cc60_41c2_9cc5_bf082d05b1ff.slice/crio-78f10ed71918a6c2f0ce3841cd34a9ceafa1e468e04f1574bb6e1a26d394c203 WatchSource:0}: Error finding container 78f10ed71918a6c2f0ce3841cd34a9ceafa1e468e04f1574bb6e1a26d394c203: Status 404 returned error can't find the container with id 78f10ed71918a6c2f0ce3841cd34a9ceafa1e468e04f1574bb6e1a26d394c203 Apr 22 18:01:18.246148 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:18.246109 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" event={"ID":"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff","Type":"ContainerStarted","Data":"78f10ed71918a6c2f0ce3841cd34a9ceafa1e468e04f1574bb6e1a26d394c203"} Apr 22 18:01:22.827675 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.827640 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-bvn66"] Apr 22 18:01:22.830581 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.830565 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:22.833408 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.833387 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-wnbnm\""
Apr 22 18:01:22.833611 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.833596 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 18:01:22.833899 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.833880 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 18:01:22.842135 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.842115 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-bvn66"]
Apr 22 18:01:22.988744 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.988710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-cabundle0\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:22.988926 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.988781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgszg\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-kube-api-access-sgszg\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:22.988926 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:22.988838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.089346 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.089247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-cabundle0\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.089346 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.089287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgszg\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-kube-api-access-sgszg\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.089346 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.089315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.089630 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.089423 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:01:23.089630 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.089435 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:01:23.089630 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.089444 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-bvn66: references non-existent secret key: ca.crt
Apr 22 18:01:23.089630 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.089506 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates podName:bcb5a9df-d2c8-4060-8619-d3b6f4d56259 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:23.589487378 +0000 UTC m=+446.378077124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates") pod "keda-operator-ffbb595cb-bvn66" (UID: "bcb5a9df-d2c8-4060-8619-d3b6f4d56259") : references non-existent secret key: ca.crt
Apr 22 18:01:23.089988 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.089967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-cabundle0\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.098739 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.098715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgszg\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-kube-api-access-sgszg\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.124794 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.124734 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"]
Apr 22 18:01:23.128421 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.128401 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.130911 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.130893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 18:01:23.137120 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.137098 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"]
Apr 22 18:01:23.263019 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.262987 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" event={"ID":"4aaae6e0-cc60-41c2-9cc5-bf082d05b1ff","Type":"ContainerStarted","Data":"660fb9ff56bcf3c324f9d8d56ec7cecfabae9f9e01b8393c5aa7506782906a72"}
Apr 22 18:01:23.263201 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.263137 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm"
Apr 22 18:01:23.283516 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.283451 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm" podStartSLOduration=1.715278777 podStartE2EDuration="6.28343181s" podCreationTimestamp="2026-04-22 18:01:17 +0000 UTC" firstStartedPulling="2026-04-22 18:01:17.652427575 +0000 UTC m=+440.441017316" lastFinishedPulling="2026-04-22 18:01:22.220580606 +0000 UTC m=+445.009170349" observedRunningTime="2026-04-22 18:01:23.281446833 +0000 UTC m=+446.070036599" watchObservedRunningTime="2026-04-22 18:01:23.28343181 +0000 UTC m=+446.072021574"
Apr 22 18:01:23.290640 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.290604 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.290870 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.290700 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/676570bb-c4f8-464a-8643-f0495d900048-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.290870 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.290740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnp5n\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-kube-api-access-qnp5n\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.392278 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.392188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.392452 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.392297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/676570bb-c4f8-464a-8643-f0495d900048-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.392452 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.392336 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:01:23.392452 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.392357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnp5n\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-kube-api-access-qnp5n\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.392452 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.392361 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:01:23.392452 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.392452 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj: references non-existent secret key: tls.crt
Apr 22 18:01:23.392708 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.392504 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates podName:676570bb-c4f8-464a-8643-f0495d900048 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:23.89249028 +0000 UTC m=+446.681080041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates") pod "keda-metrics-apiserver-7c9f485588-hwpwj" (UID: "676570bb-c4f8-464a-8643-f0495d900048") : references non-existent secret key: tls.crt
Apr 22 18:01:23.392811 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.392718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/676570bb-c4f8-464a-8643-f0495d900048-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.408541 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.408483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnp5n\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-kube-api-access-qnp5n\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.594649 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.594608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:23.594851 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.594806 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:01:23.594851 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.594828 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:01:23.594851 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.594837 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-bvn66: references non-existent secret key: ca.crt
Apr 22 18:01:23.594974 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.594892 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates podName:bcb5a9df-d2c8-4060-8619-d3b6f4d56259 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:24.594876485 +0000 UTC m=+447.383466230 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates") pod "keda-operator-ffbb595cb-bvn66" (UID: "bcb5a9df-d2c8-4060-8619-d3b6f4d56259") : references non-existent secret key: ca.crt
Apr 22 18:01:23.897541 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:23.897508 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:23.898026 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.897684 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:01:23.898026 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.897709 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:01:23.898026 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.897731 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj: references non-existent secret key: tls.crt
Apr 22 18:01:23.898026 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:23.897818 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates podName:676570bb-c4f8-464a-8643-f0495d900048 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:24.897797184 +0000 UTC m=+447.686386946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates") pod "keda-metrics-apiserver-7c9f485588-hwpwj" (UID: "676570bb-c4f8-464a-8643-f0495d900048") : references non-existent secret key: tls.crt
Apr 22 18:01:24.603664 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:24.603629 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:24.603877 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.603804 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 18:01:24.603877 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.603825 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 18:01:24.603877 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.603835 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-bvn66: references non-existent secret key: ca.crt
Apr 22 18:01:24.603974 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.603897 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates podName:bcb5a9df-d2c8-4060-8619-d3b6f4d56259 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:26.603882377 +0000 UTC m=+449.392472119 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates") pod "keda-operator-ffbb595cb-bvn66" (UID: "bcb5a9df-d2c8-4060-8619-d3b6f4d56259") : references non-existent secret key: ca.crt
Apr 22 18:01:24.905964 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:24.905871 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:24.906418 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.906026 2578 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:01:24.906418 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.906050 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:01:24.906418 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.906073 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj: references non-existent secret key: tls.crt
Apr 22 18:01:24.906418 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:01:24.906161 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates podName:676570bb-c4f8-464a-8643-f0495d900048 nodeName:}" failed. No retries permitted until 2026-04-22 18:01:26.906138906 +0000 UTC m=+449.694728650 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates") pod "keda-metrics-apiserver-7c9f485588-hwpwj" (UID: "676570bb-c4f8-464a-8643-f0495d900048") : references non-existent secret key: tls.crt
Apr 22 18:01:26.617585 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:26.617542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:26.620139 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:26.620113 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/bcb5a9df-d2c8-4060-8619-d3b6f4d56259-certificates\") pod \"keda-operator-ffbb595cb-bvn66\" (UID: \"bcb5a9df-d2c8-4060-8619-d3b6f4d56259\") " pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:26.743070 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:26.743024 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:26.865874 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:26.865843 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-bvn66"]
Apr 22 18:01:26.868451 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:01:26.868393 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb5a9df_d2c8_4060_8619_d3b6f4d56259.slice/crio-c8accba7010922caea6919e3e48a232ce22d0ccaa16b71e26bf964d5401b1dd9 WatchSource:0}: Error finding container c8accba7010922caea6919e3e48a232ce22d0ccaa16b71e26bf964d5401b1dd9: Status 404 returned error can't find the container with id c8accba7010922caea6919e3e48a232ce22d0ccaa16b71e26bf964d5401b1dd9
Apr 22 18:01:26.918977 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:26.918940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:26.922112 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:26.922092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/676570bb-c4f8-464a-8643-f0495d900048-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hwpwj\" (UID: \"676570bb-c4f8-464a-8643-f0495d900048\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:27.040125 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:27.040086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:27.158130 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:27.158060 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"]
Apr 22 18:01:27.160927 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:01:27.160899 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676570bb_c4f8_464a_8643_f0495d900048.slice/crio-d63a3b3d9b7eced2f3e11e4bf52a6a8131c5a9026bf77af659c3fc2d69ab2797 WatchSource:0}: Error finding container d63a3b3d9b7eced2f3e11e4bf52a6a8131c5a9026bf77af659c3fc2d69ab2797: Status 404 returned error can't find the container with id d63a3b3d9b7eced2f3e11e4bf52a6a8131c5a9026bf77af659c3fc2d69ab2797
Apr 22 18:01:27.274657 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:27.274611 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj" event={"ID":"676570bb-c4f8-464a-8643-f0495d900048","Type":"ContainerStarted","Data":"d63a3b3d9b7eced2f3e11e4bf52a6a8131c5a9026bf77af659c3fc2d69ab2797"}
Apr 22 18:01:27.277890 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:27.277863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-bvn66" event={"ID":"bcb5a9df-d2c8-4060-8619-d3b6f4d56259","Type":"ContainerStarted","Data":"c8accba7010922caea6919e3e48a232ce22d0ccaa16b71e26bf964d5401b1dd9"}
Apr 22 18:01:31.293352 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:31.293310 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj" event={"ID":"676570bb-c4f8-464a-8643-f0495d900048","Type":"ContainerStarted","Data":"1fb11fcce103033d3738335b55239b81f636dc2740110ee34946394fc89649f7"}
Apr 22 18:01:31.293878 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:31.293399 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:31.294686 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:31.294657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-bvn66" event={"ID":"bcb5a9df-d2c8-4060-8619-d3b6f4d56259","Type":"ContainerStarted","Data":"5af2e53c4ea489dc7608d25e3e5a694deecccab1306cab8bf7b6175ac9b0c6dd"}
Apr 22 18:01:31.294823 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:31.294793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:01:31.310202 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:31.310151 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj" podStartSLOduration=4.547405989 podStartE2EDuration="8.31013486s" podCreationTimestamp="2026-04-22 18:01:23 +0000 UTC" firstStartedPulling="2026-04-22 18:01:27.16230832 +0000 UTC m=+449.950898062" lastFinishedPulling="2026-04-22 18:01:30.925037191 +0000 UTC m=+453.713626933" observedRunningTime="2026-04-22 18:01:31.308681316 +0000 UTC m=+454.097271079" watchObservedRunningTime="2026-04-22 18:01:31.31013486 +0000 UTC m=+454.098724625"
Apr 22 18:01:31.325785 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:31.325727 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-bvn66" podStartSLOduration=5.264843964 podStartE2EDuration="9.325713414s" podCreationTimestamp="2026-04-22 18:01:22 +0000 UTC" firstStartedPulling="2026-04-22 18:01:26.869573571 +0000 UTC m=+449.658163316" lastFinishedPulling="2026-04-22 18:01:30.930443025 +0000 UTC m=+453.719032766" observedRunningTime="2026-04-22 18:01:31.324477352 +0000 UTC m=+454.113067110" watchObservedRunningTime="2026-04-22 18:01:31.325713414 +0000 UTC m=+454.114303177"
Apr 22 18:01:42.302048 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:42.302020 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hwpwj"
Apr 22 18:01:44.268280 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:44.268245 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-p6pqm"
Apr 22 18:01:52.300015 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:01:52.299984 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-bvn66"
Apr 22 18:02:28.722291 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.722252 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-sn2lg"]
Apr 22 18:02:28.724854 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.724835 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:28.727465 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.727436 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:02:28.727465 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.727456 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:02:28.727685 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.727460 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 18:02:28.728340 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.728322 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-brxn8\""
Apr 22 18:02:28.733691 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.733650 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-sn2lg"]
Apr 22 18:02:28.761861 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.761828 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-hlbzs"]
Apr 22 18:02:28.763971 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.763954 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:28.766697 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.766673 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 18:02:28.767298 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.767276 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-w99hj\""
Apr 22 18:02:28.777633 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.777607 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hlbzs"]
Apr 22 18:02:28.908349 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.908318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:28.908349 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.908353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ldx2\" (UniqueName: \"kubernetes.io/projected/866ef802-e659-4c5f-8d70-9b6a6ade3843-kube-api-access-5ldx2\") pod \"seaweedfs-86cc847c5c-hlbzs\" (UID: \"866ef802-e659-4c5f-8d70-9b6a6ade3843\") " pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:28.908613 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.908398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/866ef802-e659-4c5f-8d70-9b6a6ade3843-data\") pod \"seaweedfs-86cc847c5c-hlbzs\" (UID: \"866ef802-e659-4c5f-8d70-9b6a6ade3843\") " pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:28.908613 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:28.908449 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qftv\" (UniqueName: \"kubernetes.io/projected/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-kube-api-access-9qftv\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.009570 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.009470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/866ef802-e659-4c5f-8d70-9b6a6ade3843-data\") pod \"seaweedfs-86cc847c5c-hlbzs\" (UID: \"866ef802-e659-4c5f-8d70-9b6a6ade3843\") " pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:29.009570 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.009529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qftv\" (UniqueName: \"kubernetes.io/projected/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-kube-api-access-9qftv\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.009570 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.009568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.009899 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.009585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ldx2\" (UniqueName: \"kubernetes.io/projected/866ef802-e659-4c5f-8d70-9b6a6ade3843-kube-api-access-5ldx2\") pod \"seaweedfs-86cc847c5c-hlbzs\" (UID: \"866ef802-e659-4c5f-8d70-9b6a6ade3843\") " pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:29.009899 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:02:29.009727 2578 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 22 18:02:29.009899 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:02:29.009821 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert podName:e1b89f84-ed0f-41c2-bc67-4da00b430ec4 nodeName:}" failed. No retries permitted until 2026-04-22 18:02:29.509798596 +0000 UTC m=+512.298388343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert") pod "kserve-controller-manager-644fd69db4-sn2lg" (UID: "e1b89f84-ed0f-41c2-bc67-4da00b430ec4") : secret "kserve-webhook-server-cert" not found
Apr 22 18:02:29.010091 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.009948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/866ef802-e659-4c5f-8d70-9b6a6ade3843-data\") pod \"seaweedfs-86cc847c5c-hlbzs\" (UID: \"866ef802-e659-4c5f-8d70-9b6a6ade3843\") " pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:29.019419 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.019391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qftv\" (UniqueName: \"kubernetes.io/projected/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-kube-api-access-9qftv\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.019542 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.019503 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ldx2\" (UniqueName: \"kubernetes.io/projected/866ef802-e659-4c5f-8d70-9b6a6ade3843-kube-api-access-5ldx2\") pod \"seaweedfs-86cc847c5c-hlbzs\" (UID: \"866ef802-e659-4c5f-8d70-9b6a6ade3843\") " pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:29.073575 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.073530 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:29.201493 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.201457 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-hlbzs"]
Apr 22 18:02:29.204483 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:02:29.204445 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod866ef802_e659_4c5f_8d70_9b6a6ade3843.slice/crio-d91f3e3271ad3d000f979f7c6cbc2276f0030b11df7bf20346d6c4a2aaf7d9f8 WatchSource:0}: Error finding container d91f3e3271ad3d000f979f7c6cbc2276f0030b11df7bf20346d6c4a2aaf7d9f8: Status 404 returned error can't find the container with id d91f3e3271ad3d000f979f7c6cbc2276f0030b11df7bf20346d6c4a2aaf7d9f8
Apr 22 18:02:29.483275 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.483176 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hlbzs" event={"ID":"866ef802-e659-4c5f-8d70-9b6a6ade3843","Type":"ContainerStarted","Data":"d91f3e3271ad3d000f979f7c6cbc2276f0030b11df7bf20346d6c4a2aaf7d9f8"}
Apr 22 18:02:29.514235 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.514202 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.516723 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.516697 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert\") pod \"kserve-controller-manager-644fd69db4-sn2lg\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.637363 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.637325 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:29.848859 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:29.848828 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-sn2lg"]
Apr 22 18:02:29.852631 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:02:29.852593 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b89f84_ed0f_41c2_bc67_4da00b430ec4.slice/crio-628e1a43e591a7a4fff4a510fe756eb4e6cfe06f322b4567421f9a6e0153c95a WatchSource:0}: Error finding container 628e1a43e591a7a4fff4a510fe756eb4e6cfe06f322b4567421f9a6e0153c95a: Status 404 returned error can't find the container with id 628e1a43e591a7a4fff4a510fe756eb4e6cfe06f322b4567421f9a6e0153c95a
Apr 22 18:02:30.489302 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:30.489250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" event={"ID":"e1b89f84-ed0f-41c2-bc67-4da00b430ec4","Type":"ContainerStarted","Data":"628e1a43e591a7a4fff4a510fe756eb4e6cfe06f322b4567421f9a6e0153c95a"}
Apr 22 18:02:33.500540 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:33.500503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" event={"ID":"e1b89f84-ed0f-41c2-bc67-4da00b430ec4","Type":"ContainerStarted","Data":"c583714505ea0e9be430425959e5bd1d356b14aa7d5acd5940805e69bf04fdef"}
Apr 22 18:02:33.501065 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:33.500644 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:02:33.501967 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:33.501940 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-hlbzs" event={"ID":"866ef802-e659-4c5f-8d70-9b6a6ade3843","Type":"ContainerStarted","Data":"7a1fa53e1072233e6d820d331580eec2bf0f7e35c78210a25e461424489a881c"}
Apr 22 18:02:33.502092 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:33.502048 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:02:33.517482 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:33.517442 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" podStartSLOduration=2.364957143 podStartE2EDuration="5.517429266s" podCreationTimestamp="2026-04-22 18:02:28 +0000 UTC" firstStartedPulling="2026-04-22 18:02:29.854395202 +0000 UTC m=+512.642984944" lastFinishedPulling="2026-04-22 18:02:33.006867119 +0000 UTC m=+515.795457067" observedRunningTime="2026-04-22 18:02:33.516263155 +0000 UTC m=+516.304852918" watchObservedRunningTime="2026-04-22 18:02:33.517429266 +0000 UTC m=+516.306019029"
Apr 22 18:02:33.533484 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:33.533431 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-hlbzs" podStartSLOduration=1.69294151 podStartE2EDuration="5.533418158s" podCreationTimestamp="2026-04-22 18:02:28 +0000 UTC" firstStartedPulling="2026-04-22 18:02:29.206072591 +0000 UTC m=+511.994662347" lastFinishedPulling="2026-04-22 18:02:33.046549254 +0000 UTC m=+515.835138995" observedRunningTime="2026-04-22 18:02:33.53211299 +0000 UTC m=+516.320702755" watchObservedRunningTime="2026-04-22 18:02:33.533418158 +0000 UTC m=+516.322007955"
Apr 22 18:02:39.507608 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:02:39.507578 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-hlbzs"
Apr 22 18:03:04.456603 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.456523 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-sn2lg"]
Apr 22 18:03:04.457190 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.456791 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" podUID="e1b89f84-ed0f-41c2-bc67-4da00b430ec4" containerName="manager" containerID="cri-o://c583714505ea0e9be430425959e5bd1d356b14aa7d5acd5940805e69bf04fdef" gracePeriod=10
Apr 22 18:03:04.461519 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.461494 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg"
Apr 22 18:03:04.479195 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.479167 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-rdjnv"]
Apr 22 18:03:04.483539 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.483516 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.491897 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.491862 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-rdjnv"] Apr 22 18:03:04.598877 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.598844 2578 generic.go:358] "Generic (PLEG): container finished" podID="e1b89f84-ed0f-41c2-bc67-4da00b430ec4" containerID="c583714505ea0e9be430425959e5bd1d356b14aa7d5acd5940805e69bf04fdef" exitCode=0 Apr 22 18:03:04.599049 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.598922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" event={"ID":"e1b89f84-ed0f-41c2-bc67-4da00b430ec4","Type":"ContainerDied","Data":"c583714505ea0e9be430425959e5bd1d356b14aa7d5acd5940805e69bf04fdef"} Apr 22 18:03:04.600179 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.600157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v47tz\" (UniqueName: \"kubernetes.io/projected/4035efb5-c6a2-4821-96e1-cf71eda908de-kube-api-access-v47tz\") pod \"kserve-controller-manager-644fd69db4-rdjnv\" (UID: \"4035efb5-c6a2-4821-96e1-cf71eda908de\") " pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.600239 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.600199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4035efb5-c6a2-4821-96e1-cf71eda908de-cert\") pod \"kserve-controller-manager-644fd69db4-rdjnv\" (UID: \"4035efb5-c6a2-4821-96e1-cf71eda908de\") " pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.690916 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.690894 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" Apr 22 18:03:04.700900 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.700873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v47tz\" (UniqueName: \"kubernetes.io/projected/4035efb5-c6a2-4821-96e1-cf71eda908de-kube-api-access-v47tz\") pod \"kserve-controller-manager-644fd69db4-rdjnv\" (UID: \"4035efb5-c6a2-4821-96e1-cf71eda908de\") " pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.701018 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.700914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4035efb5-c6a2-4821-96e1-cf71eda908de-cert\") pod \"kserve-controller-manager-644fd69db4-rdjnv\" (UID: \"4035efb5-c6a2-4821-96e1-cf71eda908de\") " pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.703355 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.703326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4035efb5-c6a2-4821-96e1-cf71eda908de-cert\") pod \"kserve-controller-manager-644fd69db4-rdjnv\" (UID: \"4035efb5-c6a2-4821-96e1-cf71eda908de\") " pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.712372 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.712307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v47tz\" (UniqueName: \"kubernetes.io/projected/4035efb5-c6a2-4821-96e1-cf71eda908de-kube-api-access-v47tz\") pod \"kserve-controller-manager-644fd69db4-rdjnv\" (UID: \"4035efb5-c6a2-4821-96e1-cf71eda908de\") " pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.801344 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.801307 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qftv\" (UniqueName: 
\"kubernetes.io/projected/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-kube-api-access-9qftv\") pod \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " Apr 22 18:03:04.801344 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.801351 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert\") pod \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\" (UID: \"e1b89f84-ed0f-41c2-bc67-4da00b430ec4\") " Apr 22 18:03:04.803625 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.803593 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert" (OuterVolumeSpecName: "cert") pod "e1b89f84-ed0f-41c2-bc67-4da00b430ec4" (UID: "e1b89f84-ed0f-41c2-bc67-4da00b430ec4"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:03:04.803736 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.803633 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-kube-api-access-9qftv" (OuterVolumeSpecName: "kube-api-access-9qftv") pod "e1b89f84-ed0f-41c2-bc67-4da00b430ec4" (UID: "e1b89f84-ed0f-41c2-bc67-4da00b430ec4"). InnerVolumeSpecName "kube-api-access-9qftv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:03:04.839856 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.839826 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:04.902802 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.902773 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qftv\" (UniqueName: \"kubernetes.io/projected/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-kube-api-access-9qftv\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:03:04.902802 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.902803 2578 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e1b89f84-ed0f-41c2-bc67-4da00b430ec4-cert\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:03:04.961140 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:04.961106 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-rdjnv"] Apr 22 18:03:04.964247 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:03:04.964187 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4035efb5_c6a2_4821_96e1_cf71eda908de.slice/crio-47e91c69839c0791208857c9c1ab7c534f88d291a4ae1060c44ba5101fd780bc WatchSource:0}: Error finding container 47e91c69839c0791208857c9c1ab7c534f88d291a4ae1060c44ba5101fd780bc: Status 404 returned error can't find the container with id 47e91c69839c0791208857c9c1ab7c534f88d291a4ae1060c44ba5101fd780bc Apr 22 18:03:05.604994 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.604963 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" Apr 22 18:03:05.604994 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.604973 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-sn2lg" event={"ID":"e1b89f84-ed0f-41c2-bc67-4da00b430ec4","Type":"ContainerDied","Data":"628e1a43e591a7a4fff4a510fe756eb4e6cfe06f322b4567421f9a6e0153c95a"} Apr 22 18:03:05.605477 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.605020 2578 scope.go:117] "RemoveContainer" containerID="c583714505ea0e9be430425959e5bd1d356b14aa7d5acd5940805e69bf04fdef" Apr 22 18:03:05.606590 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.606561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" event={"ID":"4035efb5-c6a2-4821-96e1-cf71eda908de","Type":"ContainerStarted","Data":"730edd0f763da1af5fea85b2c575ad30646d39e44b562c2040c0a3e2b48d7da4"} Apr 22 18:03:05.606692 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.606597 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" event={"ID":"4035efb5-c6a2-4821-96e1-cf71eda908de","Type":"ContainerStarted","Data":"47e91c69839c0791208857c9c1ab7c534f88d291a4ae1060c44ba5101fd780bc"} Apr 22 18:03:05.606749 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.606705 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:05.624731 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.624688 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" podStartSLOduration=1.34255272 podStartE2EDuration="1.624672703s" podCreationTimestamp="2026-04-22 18:03:04 +0000 UTC" firstStartedPulling="2026-04-22 18:03:04.965418157 +0000 UTC m=+547.754007900" lastFinishedPulling="2026-04-22 18:03:05.247538141 +0000 UTC 
m=+548.036127883" observedRunningTime="2026-04-22 18:03:05.623412552 +0000 UTC m=+548.412002315" watchObservedRunningTime="2026-04-22 18:03:05.624672703 +0000 UTC m=+548.413262515" Apr 22 18:03:05.638519 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.638491 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-sn2lg"] Apr 22 18:03:05.641810 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.641787 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-644fd69db4-sn2lg"] Apr 22 18:03:05.766396 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:05.766362 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b89f84-ed0f-41c2-bc67-4da00b430ec4" path="/var/lib/kubelet/pods/e1b89f84-ed0f-41c2-bc67-4da00b430ec4/volumes" Apr 22 18:03:36.616222 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:36.616186 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-644fd69db4-rdjnv" Apr 22 18:03:37.500621 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.500591 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-t28wn"] Apr 22 18:03:37.500945 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.500932 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1b89f84-ed0f-41c2-bc67-4da00b430ec4" containerName="manager" Apr 22 18:03:37.500997 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.500948 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b89f84-ed0f-41c2-bc67-4da00b430ec4" containerName="manager" Apr 22 18:03:37.501033 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.501009 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1b89f84-ed0f-41c2-bc67-4da00b430ec4" containerName="manager" Apr 22 18:03:37.502943 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.502927 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:37.505531 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.505504 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-xpxkw\"" Apr 22 18:03:37.505662 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.505559 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:03:37.513050 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.513021 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-t28wn"] Apr 22 18:03:37.564985 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.564942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb5xj\" (UniqueName: \"kubernetes.io/projected/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-kube-api-access-qb5xj\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:37.565153 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.565027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-cert\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:37.665436 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.665400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-cert\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 
18:03:37.665937 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.665480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb5xj\" (UniqueName: \"kubernetes.io/projected/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-kube-api-access-qb5xj\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:37.665937 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:03:37.665565 2578 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 18:03:37.665937 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:03:37.665657 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-cert podName:1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7 nodeName:}" failed. No retries permitted until 2026-04-22 18:03:38.165635876 +0000 UTC m=+580.954225623 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-cert") pod "odh-model-controller-696fc77849-t28wn" (UID: "1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7") : secret "odh-model-controller-webhook-cert" not found Apr 22 18:03:37.678034 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:37.678009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb5xj\" (UniqueName: \"kubernetes.io/projected/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-kube-api-access-qb5xj\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:38.169949 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:38.169913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-cert\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:38.172462 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:38.172429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7-cert\") pod \"odh-model-controller-696fc77849-t28wn\" (UID: \"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7\") " pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:38.414283 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:38.414243 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:38.541515 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:38.541482 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-t28wn"] Apr 22 18:03:38.544554 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:03:38.544525 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8d3e9a_b28b_4798_abc8_8b74d2f2a4b7.slice/crio-2db3cdda1cac00658fc257654308e7fffcadd3e0b7e0c7ed6e692c2896753e2f WatchSource:0}: Error finding container 2db3cdda1cac00658fc257654308e7fffcadd3e0b7e0c7ed6e692c2896753e2f: Status 404 returned error can't find the container with id 2db3cdda1cac00658fc257654308e7fffcadd3e0b7e0c7ed6e692c2896753e2f Apr 22 18:03:38.715088 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:38.714991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-t28wn" event={"ID":"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7","Type":"ContainerStarted","Data":"2db3cdda1cac00658fc257654308e7fffcadd3e0b7e0c7ed6e692c2896753e2f"} Apr 22 18:03:41.728850 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:41.728725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-t28wn" event={"ID":"1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7","Type":"ContainerStarted","Data":"774f4b7aa2eb29b3ae10e768f187276c5cda8ba1946d9090b0ce923df84a2ec8"} Apr 22 18:03:41.728850 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:41.728825 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:41.746040 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:41.745993 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-t28wn" podStartSLOduration=1.998225214 podStartE2EDuration="4.745979363s" 
podCreationTimestamp="2026-04-22 18:03:37 +0000 UTC" firstStartedPulling="2026-04-22 18:03:38.545869084 +0000 UTC m=+581.334458829" lastFinishedPulling="2026-04-22 18:03:41.293623148 +0000 UTC m=+584.082212978" observedRunningTime="2026-04-22 18:03:41.745222937 +0000 UTC m=+584.533812726" watchObservedRunningTime="2026-04-22 18:03:41.745979363 +0000 UTC m=+584.534569127" Apr 22 18:03:42.175226 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.175191 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d9459f6dc-6h5f4"] Apr 22 18:03:42.177607 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.177578 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.189348 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.189318 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9459f6dc-6h5f4"] Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-oauth-serving-cert\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-service-ca\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-smfbm\" (UniqueName: \"kubernetes.io/projected/db61b3d2-1e27-41f5-923d-2d163ca299d2-kube-api-access-smfbm\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-config\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-oauth-config\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-serving-cert\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.208776 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.207854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-trusted-ca-bundle\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 
18:03:42.308887 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.308849 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-service-ca\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309087 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.308902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smfbm\" (UniqueName: \"kubernetes.io/projected/db61b3d2-1e27-41f5-923d-2d163ca299d2-kube-api-access-smfbm\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309087 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.308953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-config\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309087 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.308980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-oauth-config\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309238 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.309110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-serving-cert\") pod \"console-6d9459f6dc-6h5f4\" (UID: 
\"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309238 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.309149 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-trusted-ca-bundle\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309238 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.309210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-oauth-serving-cert\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.309881 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.309841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-service-ca\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.310013 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.309900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-oauth-serving-cert\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.310070 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.310036 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-config\") pod 
\"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.310129 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.310106 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db61b3d2-1e27-41f5-923d-2d163ca299d2-trusted-ca-bundle\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.311880 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.311856 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-serving-cert\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.311951 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.311905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db61b3d2-1e27-41f5-923d-2d163ca299d2-console-oauth-config\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.318204 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.318180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smfbm\" (UniqueName: \"kubernetes.io/projected/db61b3d2-1e27-41f5-923d-2d163ca299d2-kube-api-access-smfbm\") pod \"console-6d9459f6dc-6h5f4\" (UID: \"db61b3d2-1e27-41f5-923d-2d163ca299d2\") " pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.488653 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.488572 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:42.624294 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.624257 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d9459f6dc-6h5f4"] Apr 22 18:03:42.629001 ip-10-0-130-112 kubenswrapper[2578]: W0422 18:03:42.628955 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb61b3d2_1e27_41f5_923d_2d163ca299d2.slice/crio-4712fe7385965ecd6c7b1f0f76e72aefae0a27f70aa1395fc4aa4cfcbc2a97ac WatchSource:0}: Error finding container 4712fe7385965ecd6c7b1f0f76e72aefae0a27f70aa1395fc4aa4cfcbc2a97ac: Status 404 returned error can't find the container with id 4712fe7385965ecd6c7b1f0f76e72aefae0a27f70aa1395fc4aa4cfcbc2a97ac Apr 22 18:03:42.734955 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.734916 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9459f6dc-6h5f4" event={"ID":"db61b3d2-1e27-41f5-923d-2d163ca299d2","Type":"ContainerStarted","Data":"e325eb698061b0569a860d20c45284a57bbf179a10191285597717253c21bbb2"} Apr 22 18:03:42.734955 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.734961 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d9459f6dc-6h5f4" event={"ID":"db61b3d2-1e27-41f5-923d-2d163ca299d2","Type":"ContainerStarted","Data":"4712fe7385965ecd6c7b1f0f76e72aefae0a27f70aa1395fc4aa4cfcbc2a97ac"} Apr 22 18:03:42.756054 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:42.755926 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d9459f6dc-6h5f4" podStartSLOduration=0.755909534 podStartE2EDuration="755.909534ms" podCreationTimestamp="2026-04-22 18:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:03:42.753852379 +0000 UTC 
m=+585.542442143" watchObservedRunningTime="2026-04-22 18:03:42.755909534 +0000 UTC m=+585.544499297" Apr 22 18:03:52.489491 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:52.489453 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:52.489904 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:52.489531 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:52.494139 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:52.494115 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:52.737312 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:52.737281 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-t28wn" Apr 22 18:03:52.769850 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:52.769821 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d9459f6dc-6h5f4" Apr 22 18:03:52.821573 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:52.821540 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cfc449888-ghpwq"] Apr 22 18:03:57.669746 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:57.669717 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:03:57.671503 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:03:57.671482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:04:17.842603 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:17.842503 2578 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-console/console-cfc449888-ghpwq" podUID="14635ee6-5fdd-4278-b07a-8414bbf58feb" containerName="console" containerID="cri-o://3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7" gracePeriod=15 Apr 22 18:04:18.090141 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.090118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cfc449888-ghpwq_14635ee6-5fdd-4278-b07a-8414bbf58feb/console/0.log" Apr 22 18:04:18.090280 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.090180 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 18:04:18.125100 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125017 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-service-ca\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125100 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125066 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-trusted-ca-bundle\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125303 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125122 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-config\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125303 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125154 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-oauth-serving-cert\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125303 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125191 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-oauth-config\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125303 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125221 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-serving-cert\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125303 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125275 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66bwv\" (UniqueName: \"kubernetes.io/projected/14635ee6-5fdd-4278-b07a-8414bbf58feb-kube-api-access-66bwv\") pod \"14635ee6-5fdd-4278-b07a-8414bbf58feb\" (UID: \"14635ee6-5fdd-4278-b07a-8414bbf58feb\") " Apr 22 18:04:18.125547 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125519 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-config" (OuterVolumeSpecName: "console-config") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:04:18.125598 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125541 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:04:18.125598 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125551 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:04:18.125666 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.125641 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-service-ca" (OuterVolumeSpecName: "service-ca") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:04:18.127672 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.127645 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:18.127797 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.127684 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:04:18.127797 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.127695 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14635ee6-5fdd-4278-b07a-8414bbf58feb-kube-api-access-66bwv" (OuterVolumeSpecName: "kube-api-access-66bwv") pod "14635ee6-5fdd-4278-b07a-8414bbf58feb" (UID: "14635ee6-5fdd-4278-b07a-8414bbf58feb"). InnerVolumeSpecName "kube-api-access-66bwv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:04:18.226043 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.226007 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66bwv\" (UniqueName: \"kubernetes.io/projected/14635ee6-5fdd-4278-b07a-8414bbf58feb-kube-api-access-66bwv\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.226043 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.226038 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-service-ca\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.226043 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.226050 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-trusted-ca-bundle\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.226290 ip-10-0-130-112 
kubenswrapper[2578]: I0422 18:04:18.226061 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-config\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.226290 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.226070 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14635ee6-5fdd-4278-b07a-8414bbf58feb-oauth-serving-cert\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.226290 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.226078 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-oauth-config\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.226290 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.226087 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14635ee6-5fdd-4278-b07a-8414bbf58feb-console-serving-cert\") on node \"ip-10-0-130-112.ec2.internal\" DevicePath \"\"" Apr 22 18:04:18.847263 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.847235 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cfc449888-ghpwq_14635ee6-5fdd-4278-b07a-8414bbf58feb/console/0.log" Apr 22 18:04:18.847749 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.847275 2578 generic.go:358] "Generic (PLEG): container finished" podID="14635ee6-5fdd-4278-b07a-8414bbf58feb" containerID="3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7" exitCode=2 Apr 22 18:04:18.847749 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.847311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfc449888-ghpwq" 
event={"ID":"14635ee6-5fdd-4278-b07a-8414bbf58feb","Type":"ContainerDied","Data":"3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7"} Apr 22 18:04:18.847749 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.847351 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cfc449888-ghpwq" event={"ID":"14635ee6-5fdd-4278-b07a-8414bbf58feb","Type":"ContainerDied","Data":"00d40a124a6d2cfb4b78099510480427e8d82b8efb8c7f076c157b417b7fce02"} Apr 22 18:04:18.847749 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.847356 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cfc449888-ghpwq" Apr 22 18:04:18.847749 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.847368 2578 scope.go:117] "RemoveContainer" containerID="3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7" Apr 22 18:04:18.855802 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.855746 2578 scope.go:117] "RemoveContainer" containerID="3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7" Apr 22 18:04:18.856093 ip-10-0-130-112 kubenswrapper[2578]: E0422 18:04:18.856073 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7\": container with ID starting with 3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7 not found: ID does not exist" containerID="3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7" Apr 22 18:04:18.856166 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.856102 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7"} err="failed to get container status \"3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7\": rpc error: code = NotFound desc = could not find container 
\"3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7\": container with ID starting with 3e54ad97c5c1678f31a8b3ec907ddefd9bfe139fd731798372ae1b3462dbfdd7 not found: ID does not exist" Apr 22 18:04:18.869299 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.869274 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cfc449888-ghpwq"] Apr 22 18:04:18.872480 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:18.872456 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cfc449888-ghpwq"] Apr 22 18:04:19.766748 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:04:19.766709 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14635ee6-5fdd-4278-b07a-8414bbf58feb" path="/var/lib/kubelet/pods/14635ee6-5fdd-4278-b07a-8414bbf58feb/volumes" Apr 22 18:08:57.689518 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:08:57.689429 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:08:57.692010 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:08:57.691984 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:13:57.709450 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:13:57.709420 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:13:57.712546 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:13:57.712522 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:18:57.730063 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:18:57.730032 2578 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:18:57.738550 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:18:57.738526 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:23:57.757742 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:23:57.757709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:23:57.766805 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:23:57.766779 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:28:57.777303 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:28:57.777274 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:28:57.786783 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:28:57.786739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:33:57.796595 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:33:57.796484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:33:57.807029 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:33:57.807004 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:38:57.816924 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:38:57.816813 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:38:57.826523 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:38:57.826501 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:43:57.836105 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:43:57.836000 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:43:57.850926 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:43:57.850901 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:48:57.855473 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:48:57.855368 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:48:57.870033 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:48:57.870002 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:53:57.874372 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:53:57.874265 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:53:57.889602 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:53:57.889577 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:58:57.893312 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:58:57.893186 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 18:58:57.908687 ip-10-0-130-112 kubenswrapper[2578]: I0422 18:58:57.908661 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 19:03:57.917660 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:03:57.917550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 19:03:57.928119 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:03:57.928096 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 19:04:52.538840 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:52.538808 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nt9hd_0bfef4cd-ef07-46fa-889a-cf444d67efc9/global-pull-secret-syncer/0.log" Apr 22 19:04:52.701133 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:52.701093 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rcvv7_cec375e4-f4c2-456e-ae9c-d41aec34f6d1/konnectivity-agent/0.log" Apr 22 
19:04:52.724883 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:52.724852 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-112.ec2.internal_e8a719b01c36f6eedbdfdd5602a1c99a/haproxy/0.log" Apr 22 19:04:56.422484 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:56.422451 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pxjc5_893114b4-edb1-4f70-a308-a09c361eb8de/node-exporter/0.log" Apr 22 19:04:56.442426 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:56.442395 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pxjc5_893114b4-edb1-4f70-a308-a09c361eb8de/kube-rbac-proxy/0.log" Apr 22 19:04:56.464513 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:56.464481 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pxjc5_893114b4-edb1-4f70-a308-a09c361eb8de/init-textfile/0.log" Apr 22 19:04:56.753036 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:56.752930 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ljd7r_c8442c44-0c99-4d0a-86a7-b9dce45aa069/prometheus-operator/0.log" Apr 22 19:04:56.769289 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:56.769264 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-ljd7r_c8442c44-0c99-4d0a-86a7-b9dce45aa069/kube-rbac-proxy/0.log" Apr 22 19:04:58.668023 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:58.667989 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/2.log" Apr 22 19:04:58.675809 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:58.675783 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7mmh4_938d50a9-283b-4aa2-b8ec-60b629bd4253/console-operator/3.log" Apr 22 19:04:59.065466 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.065432 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d9459f6dc-6h5f4_db61b3d2-1e27-41f5-923d-2d163ca299d2/console/0.log" Apr 22 19:04:59.109738 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.109701 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ktv42_51c44193-f38a-4889-91eb-9be344531e77/download-server/0.log" Apr 22 19:04:59.538378 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.538344 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-t2gkt_1c982a62-ab1c-4e4a-973c-aafc17ef396c/volume-data-source-validator/0.log" Apr 22 19:04:59.699527 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.699493 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"] Apr 22 19:04:59.699975 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.699845 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14635ee6-5fdd-4278-b07a-8414bbf58feb" containerName="console" Apr 22 19:04:59.699975 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.699856 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="14635ee6-5fdd-4278-b07a-8414bbf58feb" containerName="console" Apr 22 19:04:59.699975 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.699912 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="14635ee6-5fdd-4278-b07a-8414bbf58feb" containerName="console" Apr 22 19:04:59.702922 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.702903 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.708904 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.708872 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-862r5\"/\"openshift-service-ca.crt\"" Apr 22 19:04:59.709094 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.708907 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-862r5\"/\"kube-root-ca.crt\"" Apr 22 19:04:59.709809 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.709786 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-862r5\"/\"default-dockercfg-dzdgq\"" Apr 22 19:04:59.715559 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.715538 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"] Apr 22 19:04:59.832394 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.832289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-sys\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.832394 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.832337 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-podres\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.832394 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.832387 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-lib-modules\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.832626 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.832420 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-proc\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.832626 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.832447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmxb7\" (UniqueName: \"kubernetes.io/projected/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-kube-api-access-hmxb7\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.933614 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-sys\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" Apr 22 19:04:59.933614 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-podres\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " 
pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-lib-modules\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-proc\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933699 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-sys\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-proc\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmxb7\" (UniqueName: \"kubernetes.io/projected/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-kube-api-access-hmxb7\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-podres\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.933903 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.933798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-lib-modules\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:04:59.950559 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:04:59.950537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmxb7\" (UniqueName: \"kubernetes.io/projected/1024bd44-dabd-4f14-93a7-e1bf21a1d2df-kube-api-access-hmxb7\") pod \"perf-node-gather-daemonset-6rrrg\" (UID: \"1024bd44-dabd-4f14-93a7-e1bf21a1d2df\") " pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:05:00.013339 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.013300 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:05:00.143043 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.143012 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"]
Apr 22 19:05:00.146031 ip-10-0-130-112 kubenswrapper[2578]: W0422 19:05:00.146003 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1024bd44_dabd_4f14_93a7_e1bf21a1d2df.slice/crio-03d9a0d79e2759c4ec56fc733cfe24b06775378482e83502ad2b9750574a9420 WatchSource:0}: Error finding container 03d9a0d79e2759c4ec56fc733cfe24b06775378482e83502ad2b9750574a9420: Status 404 returned error can't find the container with id 03d9a0d79e2759c4ec56fc733cfe24b06775378482e83502ad2b9750574a9420
Apr 22 19:05:00.147836 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.147814 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:05:00.318997 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.318959 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" event={"ID":"1024bd44-dabd-4f14-93a7-e1bf21a1d2df","Type":"ContainerStarted","Data":"842cc60aea1014a1554e60b87857056aa38d78349c4abfe7bce12cca5fcba9b7"}
Apr 22 19:05:00.318997 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.318997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" event={"ID":"1024bd44-dabd-4f14-93a7-e1bf21a1d2df","Type":"ContainerStarted","Data":"03d9a0d79e2759c4ec56fc733cfe24b06775378482e83502ad2b9750574a9420"}
Apr 22 19:05:00.319217 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.319071 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:05:00.344246 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.344145 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg" podStartSLOduration=1.344129109 podStartE2EDuration="1.344129109s" podCreationTimestamp="2026-04-22 19:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:05:00.342603057 +0000 UTC m=+4263.131192821" watchObservedRunningTime="2026-04-22 19:05:00.344129109 +0000 UTC m=+4263.132718873"
Apr 22 19:05:00.367315 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.367283 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9h2ht_8201d0cd-3948-40c0-a905-1339af44c0e0/dns/0.log"
Apr 22 19:05:00.391156 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.391132 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9h2ht_8201d0cd-3948-40c0-a905-1339af44c0e0/kube-rbac-proxy/0.log"
Apr 22 19:05:00.475933 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:00.475905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6d4jf_5bfed2dc-0dd0-4e54-bfc5-dba8108bcdeb/dns-node-resolver/0.log"
Apr 22 19:05:01.027288 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:01.027252 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-75545b9c8c-9xb6c_d1337fc7-a968-4b05-8ed3-4921f082016a/registry/0.log"
Apr 22 19:05:01.093423 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:01.093388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w5cfx_77641058-c12b-451a-a008-3394e1d05e7f/node-ca/0.log"
Apr 22 19:05:02.266321 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:02.266293 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-w9kfc_7faa2f31-ff6f-410e-94f4-9f9b7810616b/serve-healthcheck-canary/0.log"
Apr 22 19:05:02.632317 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:02.632235 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6qmlv_817cb43c-5c91-44cf-b3e9-cc2c4b8b6449/kube-rbac-proxy/0.log"
Apr 22 19:05:02.659988 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:02.659962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6qmlv_817cb43c-5c91-44cf-b3e9-cc2c4b8b6449/exporter/0.log"
Apr 22 19:05:02.682934 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:02.682892 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6qmlv_817cb43c-5c91-44cf-b3e9-cc2c4b8b6449/extractor/0.log"
Apr 22 19:05:04.762341 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:04.762310 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-644fd69db4-rdjnv_4035efb5-c6a2-4821-96e1-cf71eda908de/manager/0.log"
Apr 22 19:05:05.274190 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:05.274155 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-t28wn_1c8d3e9a-b28b-4798-abc8-8b74d2f2a4b7/manager/0.log"
Apr 22 19:05:05.328297 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:05.328270 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-hlbzs_866ef802-e659-4c5f-8d70-9b6a6ade3843/seaweedfs/0.log"
Apr 22 19:05:06.332412 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:06.332387 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-862r5/perf-node-gather-daemonset-6rrrg"
Apr 22 19:05:10.485716 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.485687 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:05:10.509883 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.509849 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/egress-router-binary-copy/0.log"
Apr 22 19:05:10.536695 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.536659 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/cni-plugins/0.log"
Apr 22 19:05:10.569028 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.569001 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/bond-cni-plugin/0.log"
Apr 22 19:05:10.592459 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.592429 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/routeoverride-cni/0.log"
Apr 22 19:05:10.625419 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.625385 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/whereabouts-cni-bincopy/0.log"
Apr 22 19:05:10.669299 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:10.669272 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f496z_d8999947-ef96-4a70-8257-7e319d5967db/whereabouts-cni/0.log"
Apr 22 19:05:11.087277 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:11.087239 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dgpsk_60af7321-9798-4032-9193-93cd4606e87a/kube-multus/0.log"
Apr 22 19:05:11.133898 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:11.133865 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4jgqt_8beb4b83-ece9-44df-bc80-fea79bf050d5/network-metrics-daemon/0.log"
Apr 22 19:05:11.154809 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:11.154714 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4jgqt_8beb4b83-ece9-44df-bc80-fea79bf050d5/kube-rbac-proxy/0.log"
Apr 22 19:05:12.217473 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.217436 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/ovn-controller/0.log"
Apr 22 19:05:12.274274 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.274240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/ovn-acl-logging/0.log"
Apr 22 19:05:12.295907 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.295874 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/kube-rbac-proxy-node/0.log"
Apr 22 19:05:12.317213 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.317179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:05:12.333368 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.333340 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/northd/0.log"
Apr 22 19:05:12.354158 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.354136 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/nbdb/0.log"
Apr 22 19:05:12.374643 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.374617 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/sbdb/0.log"
Apr 22 19:05:12.530185 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:12.530155 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jx4b_90656894-8740-4f11-8fc1-932678cddd3c/ovnkube-controller/0.log"
Apr 22 19:05:13.903528 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:13.903490 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vpl9w_84ebfb35-ba39-488d-9650-3d97856af9a3/check-endpoints/0.log"
Apr 22 19:05:13.924852 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:13.924827 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bnklt_76eee7a4-cad7-46a8-bca3-2eb3e2d99cd1/network-check-target-container/0.log"
Apr 22 19:05:14.927178 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:14.927140 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rzm27_f2a82d37-d50f-4213-8068-442b3abeb6b3/iptables-alerter/0.log"
Apr 22 19:05:15.641230 ip-10-0-130-112 kubenswrapper[2578]: I0422 19:05:15.641199 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xvsrm_ef7bdbb8-6d5c-428b-9aa0-57c42271876a/tuned/0.log"