Apr 16 14:52:07.494875 ip-10-0-142-86 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:07.957169 ip-10-0-142-86 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:07.957169 ip-10-0-142-86 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:07.957169 ip-10-0-142-86 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:07.957169 ip-10-0-142-86 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:07.957169 ip-10-0-142-86 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:07.958698 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.958619 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:07.960812 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960797 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:07.960812 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960812 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960816 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960819 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960822 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960825 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960828 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960830 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960833 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960836 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960838 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960842 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960849 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960852 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960855 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960857 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960861 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960865 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960868 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960871 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960874 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:07.960871 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960877 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960880 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960883 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960886 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960889 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960892 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960894 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960897 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960900 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960902 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960905 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960907 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960910 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960913 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960915 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960918 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960920 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960923 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960926 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:07.961363 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960930 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960934 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960937 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960939 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960942 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960944 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960947 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960950 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960953 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960955 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960958 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960960 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960963 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960965 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960968 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960971 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960974 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960977 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960980 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960982 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:07.961866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960985 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960988 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960990 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960993 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960996 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.960998 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961001 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961003 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961006 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961009 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961011 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961015 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961018 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961021 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961023 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961026 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961029 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961031 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961034 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:07.962379 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961037 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961040 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961043 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961046 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961048 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961051 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961053 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961413 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961418 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961421 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961425 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961427 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961430 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961435 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961438 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961441 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961444 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961447 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961450 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961452 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:07.962848 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961455 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961458 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961461 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961464 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961467 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961470 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961472 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961475 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961477 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961480 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961482 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961485 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961488 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961490 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961493 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961495 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961498 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961501 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961503 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:07.963352 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961506 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961510 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961512 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961515 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961518 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961520 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961523 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961525 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961528 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961530 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961533 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961536 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961538 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961540 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961543 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961546 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961549 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961552 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961554 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961557 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:07.963882 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961560 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961563 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961566 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961569 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961572 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961576 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961579 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961581 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961584 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961586 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961589 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961591 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961594 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961596 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961599 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961601 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961618 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961621 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961624 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961627 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:07.964359 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961629 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961632 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961635 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961637 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961640 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961642 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961645 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961648 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961650 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961653 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961656 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961658 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961661 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.961664 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961742 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961749 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961755 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961761 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961765 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961768 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961772 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:07.964866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961777 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961780 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961783 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961787 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961790 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961794 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961797 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961800 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961803 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961806 2561 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961809 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961812 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961816 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961819 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961822 2561 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961825 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961828 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961832 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961835 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961838 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961842 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961845 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961848 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961851 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961854 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:07.965390 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961857 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961861 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961865 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961868 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961871 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961874 2561 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961877 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961882 2561 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961885 2561 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961888 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961891 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961894 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961898 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961901 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961904 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961907 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961910 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961913 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961916 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961920 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961922 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961925 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961928 2561 flags.go:64] FLAG: --feature-gates=""
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961932 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961935 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 14:52:07.966003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961938 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961941 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961945 2561 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961948 2561 flags.go:64] FLAG: --help="false"
Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416
14:52:07.961950 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-142-86.ec2.internal" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961954 2561 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961957 2561 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961960 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961963 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961967 2561 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961970 2561 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961973 2561 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961976 2561 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961979 2561 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961981 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961985 2561 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961988 2561 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961990 2561 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 
14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961993 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961997 2561 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.961999 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962002 2561 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962005 2561 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962008 2561 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:07.966602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962011 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962017 2561 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962021 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962024 2561 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962026 2561 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962030 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962033 2561 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962036 2561 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962039 2561 
flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962043 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962046 2561 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962050 2561 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962053 2561 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962056 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962059 2561 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962062 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962065 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962068 2561 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962071 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962089 2561 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962093 2561 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962096 2561 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:07.967191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962099 2561 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:07.967191 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962102 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962107 2561 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962118 2561 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962123 2561 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962130 2561 flags.go:64] FLAG: --port="10250" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962133 2561 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962136 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0aa909b3037399c2f" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962139 2561 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962142 2561 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962145 2561 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962148 2561 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962151 2561 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962160 2561 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962163 2561 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 
14:52:07.962166 2561 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962169 2561 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962172 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962175 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962179 2561 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962182 2561 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962185 2561 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962188 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962191 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962194 2561 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962197 2561 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962200 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:07.967765 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962203 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962207 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962210 2561 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 
14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962213 2561 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962215 2561 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962219 2561 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962221 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962224 2561 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962227 2561 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962230 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962237 2561 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962240 2561 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962243 2561 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962247 2561 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962249 2561 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962252 2561 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962255 2561 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962259 2561 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962263 2561 flags.go:64] FLAG: --v="2" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962267 2561 flags.go:64] FLAG: --version="false" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962270 2561 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962275 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.962278 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962368 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:07.968435 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962373 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962375 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962378 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962381 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962384 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962387 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962390 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:07.969030 ip-10-0-142-86 
kubenswrapper[2561]: W0416 14:52:07.962393 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962395 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962401 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962403 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962406 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962409 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962411 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962415 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962419 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962422 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962425 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962430 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:07.969030 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962433 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962436 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962439 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962442 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962444 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962447 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962451 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962453 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962456 2561 feature_gate.go:328] unrecognized 
feature gate: PinnedImages Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962459 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962461 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962464 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962467 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962469 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962472 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962474 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962477 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962480 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962482 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962485 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:07.969841 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962487 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:07.970714 ip-10-0-142-86 
kubenswrapper[2561]: W0416 14:52:07.962490 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962493 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962496 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962498 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962501 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962503 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962506 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962508 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962511 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962513 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962517 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962520 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962522 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 
14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962525 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962527 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962530 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962533 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962536 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962539 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:07.970714 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962541 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962544 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962546 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962549 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962551 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962554 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962557 2561 feature_gate.go:328] unrecognized feature 
gate: NutanixMultiSubnets Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962559 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962562 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962564 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962567 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962569 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962572 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962574 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962576 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962579 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962582 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962584 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962588 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962592 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:07.971339 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962594 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962597 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962599 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962617 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962621 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.962625 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.963430 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.971711 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.971727 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" 
GOTRACEBACK="" Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971776 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971782 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971785 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971789 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971792 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971796 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:07.971859 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971799 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971802 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971805 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971808 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971810 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971813 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971816 
2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971819 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971821 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971824 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971827 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971829 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971832 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971835 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971837 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971840 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971843 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971846 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971848 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971851
2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:07.972234 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971855 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971859 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971862 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971865 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971868 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971872 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971875 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971877 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971880 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971883 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971886 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971889 2561 feature_gate.go:328] unrecognized feature gate:
IngressControllerDynamicConfigurationManager
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971892 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971895 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971898 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971901 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971904 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971906 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971909 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971912 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:07.972743 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971914 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971917 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971919 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971922 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416
14:52:07.971925 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971927 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971930 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971932 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971935 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971937 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971940 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971942 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971945 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971948 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971951 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971953 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971956 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971958 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971961 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971963 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:07.973242 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971966 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971968 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971970 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971973 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971976 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971978 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971982 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971985 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971988 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971990 2561 feature_gate.go:328] unrecognized feature gate:
AzureWorkloadIdentity
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971995 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.971999 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972002 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972005 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972007 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972010 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972013 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972016 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972018 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:07.973774 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972021 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.972026 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972116 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972121 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972124 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972126 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972129 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972131 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972134 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972136 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972139 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972142 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972144 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:07.974231 ip-10-0-142-86
kubenswrapper[2561]: W0416 14:52:07.972147 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972149 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:07.974231 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972152 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972154 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972157 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972159 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972163 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972165 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972168 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972170 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972173 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972176 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972179 2561
feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972181 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972184 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972187 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972189 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972192 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972194 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972197 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972199 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972202 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:07.974598 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972204 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972206 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972209 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972212
2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972214 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972217 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972219 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972222 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972225 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972227 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972230 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972232 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972235 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972239 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972243 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972246 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972249 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972252 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972255 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972258 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:07.975103 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972260 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972263 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972266 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972269 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972271 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972273 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972276 2561 feature_gate.go:328] unrecognized feature gate:
GCPClusterHostedDNS
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972279 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972281 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972283 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972286 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972288 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972291 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972293 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972296 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972299 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972301 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972304 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972306 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972309 2561
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:07.975593 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972311 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972314 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972316 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972319 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972321 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972324 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972326 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972330 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972333 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972336 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972339 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972342 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:07.972344 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.972349 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:07.976081 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.973104 2561 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:07.976527 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.976512 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:07.977568 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.977557 2561 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:07.977677 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.977658 2561 certificate_manager.go:422] "Certificate
rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:07.977737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:07.977721 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:08.004726 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.004701 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:08.009336 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.009315 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:08.026021 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.026004 2561 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:08.031152 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.031136 2561 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:08.032814 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.032791 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:08.036101 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.036083 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:08.037501 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.037479 2561 fs.go:135] Filesystem UUIDs: map[01d745dc-3ccc-4411-b04a-300e24c264c2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c600612a-a5a7-49a9-834e-17b40b60835e:/dev/nvme0n1p3]
Apr 16 14:52:08.037554 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.037502 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0}
/run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:08.043540 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.043434 2561 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:08.041288323 +0000 UTC m=+0.422285927 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107040 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c8aa1ca445c8e308a14d8f64572be SystemUUID:ec2c8aa1-ca44-5c8e-308a-14d8f64572be BootID:e6016e00-c18b-42d8-83e9-eeae1bead306 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8e:d0:cf:b4:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8e:d0:cf:b4:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:6b:ab:58:67:9a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0
Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:08.043540 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.043538 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:08.043674 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.043626 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:08.044544 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.044523 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:08.044688 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.044545 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config"
nodeConfig={"NodeName":"ip-10-0-142-86.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:52:08.044737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.044698 2561 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:52:08.044737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.044707 2561 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:52:08.044737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.044719 2561 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:08.044737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.044732 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:08.045496 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.045487 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:08.045597 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.045587 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:08.047986 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.047976 2561 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:08.048026 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.047988 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 14:52:08.048026 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.047999 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 14:52:08.048026 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.048007 2561 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:08.048026 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.048016 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:52:08.049682 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.049663 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:08.049757 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.049703 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:08.058683 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.058661 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:08.060544 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:52:08.060529 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:08.060890 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.060859 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:08.060890 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.060874 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-86.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:08.061910 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061892 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061917 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061926 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061934 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061943 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061951 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:08.061998 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:52:08.061959 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061968 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061980 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.061989 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:08.061998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.062002 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:08.062304 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.062017 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:08.063777 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.063766 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:08.063842 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.063782 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:52:08.066722 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.066704 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-86.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:52:08.067126 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.067113 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:08.067184 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.067154 2561 server.go:1295] "Started kubelet" Apr 16 14:52:08.067276 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.067227 2561 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 16 14:52:08.067372 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.067324 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:08.067425 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.067396 2561 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:08.068011 ip-10-0-142-86 systemd[1]: Started Kubernetes Kubelet. Apr 16 14:52:08.068545 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.068473 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:08.069814 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.069800 2561 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:08.072900 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.072874 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rjp7v" Apr 16 14:52:08.074040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.074017 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:08.075148 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.075123 2561 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:08.076248 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.076234 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:08.076845 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.076829 2561 factory.go:55] Registering systemd factory Apr 16 14:52:08.076925 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.076886 2561 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:08.077082 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077067 2561 factory.go:153] Registering CRI-O factory Apr 16 14:52:08.077154 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077091 2561 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:08.077154 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.077106 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.077154 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077144 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:08.077299 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077165 2561 factory.go:103] Registering Raw factory Apr 16 14:52:08.077299 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077180 2561 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:08.077299 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077181 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:08.077299 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077190 2561 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:08.077299 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077203 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:08.077299 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077285 2561 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:08.077299 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077296 2561 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:08.077822 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.077808 2561 manager.go:319] Starting recovery of all containers Apr 16 14:52:08.081001 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.080977 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rjp7v" Apr 16 14:52:08.083175 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.082128 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-86.ec2.internal.18a6ddf080ce3502 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-86.ec2.internal,UID:ip-10-0-142-86.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-86.ec2.internal,},FirstTimestamp:2026-04-16 14:52:08.06712653 +0000 UTC m=+0.448124136,LastTimestamp:2026-04-16 14:52:08.06712653 +0000 UTC m=+0.448124136,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-86.ec2.internal,}" Apr 16 14:52:08.083536 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.083511 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-86.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get 
resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 14:52:08.084207 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.084176 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 14:52:08.089354 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.089210 2561 manager.go:324] Recovery completed Apr 16 14:52:08.093429 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.093417 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:08.095527 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.095514 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:08.095578 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.095539 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:08.095578 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.095549 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:08.096044 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.096031 2561 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 14:52:08.096044 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.096043 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:08.096112 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.096058 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:08.098134 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.098124 2561 
policy_none.go:49] "None policy: Start" Apr 16 14:52:08.098178 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.098139 2561 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:08.098178 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.098148 2561 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:08.132291 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132276 2561 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:08.132373 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.132304 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:08.132373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132314 2561 server.go:85] "Starting device plugin registration server" Apr 16 14:52:08.132529 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132514 2561 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:52:08.132635 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132530 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:08.132635 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132619 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:08.132739 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132697 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:08.132739 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.132706 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:08.133125 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.133100 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 14:52:08.133190 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.133134 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.219515 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.219460 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:08.220638 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.220603 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:08.220691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.220653 2561 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:08.220691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.220673 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 14:52:08.220691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.220682 2561 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:08.220811 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.220718 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:08.222750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.222727 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:08.233414 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.233399 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:08.234131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.234109 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:08.234212 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.234148 2561 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:08.234212 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.234163 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:08.234212 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.234189 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.240234 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.240220 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.240300 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.240239 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-86.ec2.internal\": node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.253091 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.253071 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.321521 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.321500 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal"] Apr 16 14:52:08.321624 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.321563 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:08.322285 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.322272 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:08.322363 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.322301 2561 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-142-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:08.322363 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.322316 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:08.323555 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.323540 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:08.323713 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.323699 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.323760 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.323726 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:08.324243 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.324219 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:08.324323 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.324253 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:08.324323 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.324225 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:08.324323 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.324264 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:08.324437 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.324301 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:08.324437 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.324347 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:08.325732 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.325719 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.325790 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.325742 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:08.326341 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.326326 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:08.326429 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.326359 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:08.326429 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.326371 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:08.341155 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.341138 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-86.ec2.internal\" not found" node="ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.345335 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.345319 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-86.ec2.internal\" not found" node="ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.354102 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.354085 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 
16 14:52:08.378798 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.378776 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e98882ee5619cf5d9be6b369dc8f0f8-config\") pod \"kube-apiserver-proxy-ip-10-0-142-86.ec2.internal\" (UID: \"2e98882ee5619cf5d9be6b369dc8f0f8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.378885 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.378805 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c411056c01183f890ff0140711b4d529-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal\" (UID: \"c411056c01183f890ff0140711b4d529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.378885 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.378830 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c411056c01183f890ff0140711b4d529-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal\" (UID: \"c411056c01183f890ff0140711b4d529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.454535 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.454514 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.479939 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.479890 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c411056c01183f890ff0140711b4d529-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal\" (UID: \"c411056c01183f890ff0140711b4d529\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.479939 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.479928 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c411056c01183f890ff0140711b4d529-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal\" (UID: \"c411056c01183f890ff0140711b4d529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.480038 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.479945 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e98882ee5619cf5d9be6b369dc8f0f8-config\") pod \"kube-apiserver-proxy-ip-10-0-142-86.ec2.internal\" (UID: \"2e98882ee5619cf5d9be6b369dc8f0f8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.480038 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.479980 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c411056c01183f890ff0140711b4d529-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal\" (UID: \"c411056c01183f890ff0140711b4d529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.480038 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.479995 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c411056c01183f890ff0140711b4d529-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal\" (UID: \"c411056c01183f890ff0140711b4d529\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.480038 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.480031 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2e98882ee5619cf5d9be6b369dc8f0f8-config\") pod \"kube-apiserver-proxy-ip-10-0-142-86.ec2.internal\" (UID: \"2e98882ee5619cf5d9be6b369dc8f0f8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.555302 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.555282 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.644807 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.644784 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.648142 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.648127 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" Apr 16 14:52:08.655975 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.655958 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.756537 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.756481 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.857014 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.856988 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.957619 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:08.957587 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:08.977848 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.977827 2561 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 16 14:52:08.978346 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:08.977966 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:09.058517 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:09.058486 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:09.074305 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.074286 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:09.083010 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.082973 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:08 +0000 UTC" deadline="2027-10-11 13:31:14.252952503 +0000 UTC" Apr 16 14:52:09.083010 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.083008 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13030h39m5.16994744s" Apr 16 14:52:09.090416 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.090399 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:09.108205 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.108186 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sgnw4" Apr 16 14:52:09.115226 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.115210 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-sgnw4" Apr 16 14:52:09.134554 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:09.134531 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e98882ee5619cf5d9be6b369dc8f0f8.slice/crio-3602dcad96157989f0ea86ecfa68a84c99896079da520581fd6092a84cc6e7fa WatchSource:0}: Error finding container 3602dcad96157989f0ea86ecfa68a84c99896079da520581fd6092a84cc6e7fa: Status 404 returned error can't find the container with id 3602dcad96157989f0ea86ecfa68a84c99896079da520581fd6092a84cc6e7fa Apr 16 14:52:09.135158 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:09.135142 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc411056c01183f890ff0140711b4d529.slice/crio-5ade18addfca5f12cdb6a47824d7ba6cf34b86e8b4ad3906642e93ae22f21b7e WatchSource:0}: Error finding container 5ade18addfca5f12cdb6a47824d7ba6cf34b86e8b4ad3906642e93ae22f21b7e: Status 404 returned error can't find the container with id 5ade18addfca5f12cdb6a47824d7ba6cf34b86e8b4ad3906642e93ae22f21b7e Apr 16 14:52:09.138772 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.138760 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:09.159136 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:09.159114 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:09.223238 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.223198 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" event={"ID":"c411056c01183f890ff0140711b4d529","Type":"ContainerStarted","Data":"5ade18addfca5f12cdb6a47824d7ba6cf34b86e8b4ad3906642e93ae22f21b7e"} Apr 16 14:52:09.224139 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.224118 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" event={"ID":"2e98882ee5619cf5d9be6b369dc8f0f8","Type":"ContainerStarted","Data":"3602dcad96157989f0ea86ecfa68a84c99896079da520581fd6092a84cc6e7fa"} Apr 16 14:52:09.259229 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:09.259211 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:09.359694 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:09.359641 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:09.460185 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:09.460158 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:09.465900 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.465878 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:09.560912 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:09.560886 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-86.ec2.internal\" not found" Apr 16 14:52:09.604676 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.604382 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:09.606748 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.606586 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:09.676902 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.676848 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" Apr 16 14:52:09.690703 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.690678 2561 
warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:09.691654 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.691631 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" Apr 16 14:52:09.699334 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:09.699255 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:10.048703 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.048630 2561 apiserver.go:52] "Watching apiserver" Apr 16 14:52:10.056944 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.056921 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:10.057331 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.057307 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-nzfb6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm","openshift-cluster-node-tuning-operator/tuned-bg8fh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal","openshift-multus/multus-4h6ft","openshift-multus/multus-additional-cni-plugins-575fz","openshift-multus/network-metrics-daemon-bppkn","openshift-network-operator/iptables-alerter-577nk","kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal","openshift-image-registry/node-ca-tkx2t","openshift-network-diagnostics/network-check-target-cp47z","openshift-ovn-kubernetes/ovnkube-node-vtjk7"] Apr 16 14:52:10.059674 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.059649 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.061129 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.061111 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.062229 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.062214 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-z8nh9\"" Apr 16 14:52:10.062317 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.062256 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:10.062317 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.062218 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:10.062690 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.062673 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.063250 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.063231 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:52:10.063474 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.063453 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:52:10.063556 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.063489 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:52:10.063688 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.063559 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8cxk6\"" Apr 16 14:52:10.064575 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.064537 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.065248 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.065230 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:10.065336 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.065279 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:10.065336 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.065284 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-58s96\"" Apr 16 14:52:10.066320 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.066302 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.066687 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.066670 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:10.067634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.067130 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:10.067634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.067168 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:10.067634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.067179 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tc7xh\"" Apr 16 14:52:10.067634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.067232 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:10.068300 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.068283 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:10.068503 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.068367 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:10.068741 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.068723 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:10.068814 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.068747 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:10.068814 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.068778 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vgtz9\"" Apr 16 14:52:10.071937 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.071912 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.072138 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.072060 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.076320 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076266 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:10.076409 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076361 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:10.076475 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076433 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:10.076750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076629 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:10.076750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076654 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lvsxn\"" Apr 16 14:52:10.076971 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076778 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-847j8\"" Apr 16 14:52:10.076971 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076878 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:10.076971 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.076925 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:10.078257 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.077493 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:10.078257 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.077568 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:10.078257 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.077599 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.079715 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.079696 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:10.079805 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.079765 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fpdhf\"" Apr 16 14:52:10.079867 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.079829 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:10.080204 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.080185 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:10.080204 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.080200 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:10.080385 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.080361 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:10.080435 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.080389 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:10.081786 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.081767 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:10.088767 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088749 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-run\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.088877 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088858 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c9a20462-c104-4c17-a123-4c9d7acb06df-agent-certs\") pod \"konnectivity-agent-nzfb6\" (UID: \"c9a20462-c104-4c17-a123-4c9d7acb06df\") " pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.088941 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088892 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.088941 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088917 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-kubernetes\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.089036 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088939 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-cnibin\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089036 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088962 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-hostroot\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089036 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.088986 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-registration-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.089036 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089014 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-sys\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.089190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089046 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysctl-conf\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.089190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089093 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-system-cni-dir\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.089190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089141 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g86ss\" (UniqueName: \"kubernetes.io/projected/c1f3fc48-12ff-4b51-a439-082e842f2b08-kube-api-access-g86ss\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.089190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089173 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089200 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c9a20462-c104-4c17-a123-4c9d7acb06df-konnectivity-ca\") pod \"konnectivity-agent-nzfb6\" (UID: \"c9a20462-c104-4c17-a123-4c9d7acb06df\") " 
pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089224 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-cni-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089247 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-daemon-config\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089273 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/227b75fb-5d17-43d6-a870-e0959b3989c4-iptables-alerter-script\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089297 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-socket-dir-parent\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089321 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-multus-certs\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089373 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089358 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-etc-kubernetes\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089391 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-var-lib-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089430 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089477 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-ovn\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089501 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-log-socket\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089530 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kskkc\" (UniqueName: \"kubernetes.io/projected/227b75fb-5d17-43d6-a870-e0959b3989c4-kube-api-access-kskkc\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089555 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-splxd\" (UniqueName: \"kubernetes.io/projected/c6c72ed9-22d4-4b46-99b0-c1f258e78270-kube-api-access-splxd\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089574 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-env-overrides\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089629 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-etc-selinux\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089664 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-lib-modules\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.089708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089690 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f593ceff-50dd-4533-bbeb-0dcc375c12b9-serviceca\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089775 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-system-cni-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089803 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-conf-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089836 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-run-netns\") pod 
\"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089855 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-etc-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089869 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-node-log\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089884 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovnkube-config\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089898 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-modprobe-d\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089953 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-cnibin\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.089997 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnxvs\" (UniqueName: \"kubernetes.io/projected/ed427102-c549-468d-8146-32fba6da0a45-kube-api-access-nnxvs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090032 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-netns\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090055 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-slash\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090102 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-os-release\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090153 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-cni-bin\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090184 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-cni-multus\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090212 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-kubelet\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.090327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090236 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090251 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n5dg\" (UniqueName: \"kubernetes.io/projected/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-kube-api-access-8n5dg\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090273 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/227b75fb-5d17-43d6-a870-e0959b3989c4-host-slash\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090296 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f593ceff-50dd-4533-bbeb-0dcc375c12b9-host\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090322 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-os-release\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090362 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-k8s-cni-cncf-io\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090394 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090437 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-cni-binary-copy\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090514 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-systemd\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090538 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-cni-bin\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090590 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-socket-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090657 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysctl-d\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090701 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090725 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-cni-netd\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090771 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-systemd-units\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090800 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovn-node-metrics-cert\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090819 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-device-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090835 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lz7\" (UniqueName: \"kubernetes.io/projected/3fc647f3-6334-4298-9319-057ba1d04ae8-kube-api-access-q6lz7\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090864 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysconfig\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090886 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-systemd\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090904 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59571617-fe46-4cd8-8766-d1b4dbb300e8-tmp\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090928 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6c72ed9-22d4-4b46-99b0-c1f258e78270-cni-binary-copy\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090971 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.090996 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-sys-fs\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091021 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-host\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091055 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwlx5\" (UniqueName: \"kubernetes.io/projected/59571617-fe46-4cd8-8766-d1b4dbb300e8-kube-api-access-kwlx5\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091090 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091122 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-var-lib-kubelet\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091138 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-tuned\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091152 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96ps\" (UniqueName: \"kubernetes.io/projected/f593ceff-50dd-4533-bbeb-0dcc375c12b9-kube-api-access-j96ps\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091167 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovnkube-script-lib\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.091817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.091206 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-kubelet\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.115935 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.115911 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:09 +0000 UTC" deadline="2027-11-28 23:17:44.972531307 +0000 UTC"
Apr 16 14:52:10.116013 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.115933 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14192h25m34.856601077s"
Apr 16 14:52:10.191982 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.191950 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-socket-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.191996 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysctl-d\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192019 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192039 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-cni-netd\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192063 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-systemd-units\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192085 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovn-node-metrics-cert\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192105 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-device-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192125 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192125 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lz7\" (UniqueName: \"kubernetes.io/projected/3fc647f3-6334-4298-9319-057ba1d04ae8-kube-api-access-q6lz7\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192149 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysconfig\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192146 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysctl-d\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192158 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-cni-netd\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192170 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-systemd\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192150 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-systemd-units\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192189 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-socket-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192203 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-device-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192227 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-systemd\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192229 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysconfig\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192305 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59571617-fe46-4cd8-8766-d1b4dbb300e8-tmp\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192352 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6c72ed9-22d4-4b46-99b0-c1f258e78270-cni-binary-copy\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192377 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192404 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-sys-fs\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192450 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-kubelet-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.192456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192460 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-sys-fs\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192479 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-host\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192519 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-host\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192518 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwlx5\" (UniqueName: \"kubernetes.io/projected/59571617-fe46-4cd8-8766-d1b4dbb300e8-kube-api-access-kwlx5\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192509 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192563 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192590 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-var-lib-kubelet\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192629 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-tuned\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192654 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j96ps\" (UniqueName: \"kubernetes.io/projected/f593ceff-50dd-4533-bbeb-0dcc375c12b9-kube-api-access-j96ps\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192681 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovnkube-script-lib\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192706 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-kubelet\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192728 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-run\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192753 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c9a20462-c104-4c17-a123-4c9d7acb06df-agent-certs\") pod \"konnectivity-agent-nzfb6\" (UID: \"c9a20462-c104-4c17-a123-4c9d7acb06df\") " pod="kube-system/konnectivity-agent-nzfb6"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192778 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192824 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-kubernetes\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192848 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-cnibin\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192872 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-kubelet\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192894 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-hostroot\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft"
Apr 16 14:52:10.193131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192912 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-registration-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192927 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-sys\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192935 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192947 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysctl-conf\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.192992 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-system-cni-dir\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193013 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-run\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193023 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g86ss\" (UniqueName: \"kubernetes.io/projected/c1f3fc48-12ff-4b51-a439-082e842f2b08-kube-api-access-g86ss\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193055 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193070 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416
14:52:10.193082 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c9a20462-c104-4c17-a123-4c9d7acb06df-konnectivity-ca\") pod \"konnectivity-agent-nzfb6\" (UID: \"c9a20462-c104-4c17-a123-4c9d7acb06df\") " pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193109 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-cni-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193116 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-kubernetes\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193133 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-daemon-config\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193157 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/227b75fb-5d17-43d6-a870-e0959b3989c4-iptables-alerter-script\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: 
I0416 14:52:10.193157 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c6c72ed9-22d4-4b46-99b0-c1f258e78270-cni-binary-copy\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193210 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-system-cni-dir\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193219 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-registration-dir\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.193954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193187 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-hostroot\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193158 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-cnibin\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193137 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193274 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-sysctl-conf\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193287 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-sys\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193312 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-socket-dir-parent\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193340 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-cni-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193374 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-multus-certs\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193402 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-etc-kubernetes\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193429 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-var-lib-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.193464 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193475 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193504 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-ovn\") pod \"ovnkube-node-vtjk7\" 
(UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193527 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-log-socket\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193577 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-log-socket\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.193588 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:10.693523138 +0000 UTC m=+3.074520752 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193645 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-socket-dir-parent\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.194754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193676 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovnkube-script-lib\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193684 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-multus-certs\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193719 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-etc-kubernetes\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193757 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-var-lib-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193796 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193837 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-ovn\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193842 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kskkc\" (UniqueName: \"kubernetes.io/projected/227b75fb-5d17-43d6-a870-e0959b3989c4-kube-api-access-kskkc\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193890 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-daemon-config\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193896 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-splxd\" (UniqueName: \"kubernetes.io/projected/c6c72ed9-22d4-4b46-99b0-c1f258e78270-kube-api-access-splxd\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193963 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-env-overrides\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.193989 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-etc-selinux\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194014 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-lib-modules\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194043 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f593ceff-50dd-4533-bbeb-0dcc375c12b9-serviceca\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194066 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-system-cni-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194090 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-conf-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194125 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-run-netns\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194151 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-etc-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194173 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-node-log\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.195519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194188 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-var-lib-kubelet\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194198 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovnkube-config\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194223 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-modprobe-d\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194258 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-cnibin\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194284 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnxvs\" (UniqueName: \"kubernetes.io/projected/ed427102-c549-468d-8146-32fba6da0a45-kube-api-access-nnxvs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194313 2561 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-netns\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194340 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-slash\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194342 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c9a20462-c104-4c17-a123-4c9d7acb06df-konnectivity-ca\") pod \"konnectivity-agent-nzfb6\" (UID: \"c9a20462-c104-4c17-a123-4c9d7acb06df\") " pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194367 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-os-release\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194221 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/227b75fb-5d17-43d6-a870-e0959b3989c4-iptables-alerter-script\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194392 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-cni-bin\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194438 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-cni-bin\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194450 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-cni-multus\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194461 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-lib-modules\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194480 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-kubelet\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194523 2561 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194544 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3fc647f3-6334-4298-9319-057ba1d04ae8-etc-selinux\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.196397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194550 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n5dg\" (UniqueName: \"kubernetes.io/projected/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-kube-api-access-8n5dg\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194590 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/227b75fb-5d17-43d6-a870-e0959b3989c4-host-slash\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194631 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-system-cni-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194638 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f593ceff-50dd-4533-bbeb-0dcc375c12b9-host\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194644 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-env-overrides\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194662 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-os-release\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194696 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-netns\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194687 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-k8s-cni-cncf-io\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194712 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-cni-multus\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194723 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-os-release\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194728 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194735 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-slash\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194488 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1f3fc48-12ff-4b51-a439-082e842f2b08-cnibin\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194755 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-cni-binary-copy\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194770 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-run-netns\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194790 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f593ceff-50dd-4533-bbeb-0dcc375c12b9-serviceca\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194799 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-var-lib-kubelet\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194815 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-os-release\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.197211 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194851 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/227b75fb-5d17-43d6-a870-e0959b3989c4-host-slash\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194871 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-node-log\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194875 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-modprobe-d\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194908 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-multus-conf-dir\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194914 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f593ceff-50dd-4533-bbeb-0dcc375c12b9-host\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194941 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194952 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-etc-openvswitch\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.194958 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c6c72ed9-22d4-4b46-99b0-c1f258e78270-host-run-k8s-cni-cncf-io\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195008 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195053 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-systemd\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195287 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-cni-binary-copy\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195341 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-cni-bin\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195333 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-run-systemd\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195432 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1f3fc48-12ff-4b51-a439-082e842f2b08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195430 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-host-cni-bin\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.195584 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovnkube-config\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.196277 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59571617-fe46-4cd8-8766-d1b4dbb300e8-etc-tuned\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.196936 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c9a20462-c104-4c17-a123-4c9d7acb06df-agent-certs\") pod \"konnectivity-agent-nzfb6\" (UID: \"c9a20462-c104-4c17-a123-4c9d7acb06df\") " pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.198065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.197276 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59571617-fe46-4cd8-8766-d1b4dbb300e8-tmp\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.198908 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.198151 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-ovn-node-metrics-cert\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.202468 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.202399 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-q6lz7\" (UniqueName: \"kubernetes.io/projected/3fc647f3-6334-4298-9319-057ba1d04ae8-kube-api-access-q6lz7\") pod \"aws-ebs-csi-driver-node-94bvm\" (UID: \"3fc647f3-6334-4298-9319-057ba1d04ae8\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.203955 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.203924 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96ps\" (UniqueName: \"kubernetes.io/projected/f593ceff-50dd-4533-bbeb-0dcc375c12b9-kube-api-access-j96ps\") pod \"node-ca-tkx2t\" (UID: \"f593ceff-50dd-4533-bbeb-0dcc375c12b9\") " pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.204963 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.204937 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:10.204963 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.204964 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:10.205112 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.204978 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:10.205112 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.205034 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed. No retries permitted until 2026-04-16 14:52:10.705016316 +0000 UTC m=+3.086013924 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:10.205589 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.205565 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwlx5\" (UniqueName: \"kubernetes.io/projected/59571617-fe46-4cd8-8766-d1b4dbb300e8-kube-api-access-kwlx5\") pod \"tuned-bg8fh\" (UID: \"59571617-fe46-4cd8-8766-d1b4dbb300e8\") " pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.207011 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.206988 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g86ss\" (UniqueName: \"kubernetes.io/projected/c1f3fc48-12ff-4b51-a439-082e842f2b08-kube-api-access-g86ss\") pod \"multus-additional-cni-plugins-575fz\" (UID: \"c1f3fc48-12ff-4b51-a439-082e842f2b08\") " pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.207539 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.207518 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-splxd\" (UniqueName: \"kubernetes.io/projected/c6c72ed9-22d4-4b46-99b0-c1f258e78270-kube-api-access-splxd\") pod \"multus-4h6ft\" (UID: \"c6c72ed9-22d4-4b46-99b0-c1f258e78270\") " pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.207810 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.207789 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kskkc\" (UniqueName: \"kubernetes.io/projected/227b75fb-5d17-43d6-a870-e0959b3989c4-kube-api-access-kskkc\") pod \"iptables-alerter-577nk\" (UID: \"227b75fb-5d17-43d6-a870-e0959b3989c4\") " 
pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.208399 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.208380 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n5dg\" (UniqueName: \"kubernetes.io/projected/cf2d6237-3c32-44f2-bf46-6f36e887e3c2-kube-api-access-8n5dg\") pod \"ovnkube-node-vtjk7\" (UID: \"cf2d6237-3c32-44f2-bf46-6f36e887e3c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.208562 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.208541 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnxvs\" (UniqueName: \"kubernetes.io/projected/ed427102-c549-468d-8146-32fba6da0a45-kube-api-access-nnxvs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:10.371398 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.371307 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:10.380103 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.380077 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" Apr 16 14:52:10.387858 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.387835 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" Apr 16 14:52:10.393430 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.393413 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4h6ft" Apr 16 14:52:10.400113 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.400094 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-575fz" Apr 16 14:52:10.408583 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.408563 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-577nk" Apr 16 14:52:10.415088 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.415059 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tkx2t" Apr 16 14:52:10.420697 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.420680 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:10.496733 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.496711 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:10.698754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.698669 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:10.698914 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.698833 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:10.698914 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.698904 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:11.698883487 +0000 UTC m=+4.079881079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:10.772320 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.772290 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f3fc48_12ff_4b51_a439_082e842f2b08.slice/crio-3c71420c967bc5072010f1c817b0f091ff4c0497ae818cb87b77791732f90ec8 WatchSource:0}: Error finding container 3c71420c967bc5072010f1c817b0f091ff4c0497ae818cb87b77791732f90ec8: Status 404 returned error can't find the container with id 3c71420c967bc5072010f1c817b0f091ff4c0497ae818cb87b77791732f90ec8 Apr 16 14:52:10.773013 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.772976 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59571617_fe46_4cd8_8766_d1b4dbb300e8.slice/crio-a256304dbb5abf2e846144cabc7235a4b1e01df9a84b824ac301fa99623c275b WatchSource:0}: Error finding container a256304dbb5abf2e846144cabc7235a4b1e01df9a84b824ac301fa99623c275b: Status 404 returned error can't find the container with id a256304dbb5abf2e846144cabc7235a4b1e01df9a84b824ac301fa99623c275b Apr 16 14:52:10.775062 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.775035 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c72ed9_22d4_4b46_99b0_c1f258e78270.slice/crio-96419bb8b5bbf7d87c9cb57272d38d8a0fe47eace60e29f33b6d47f805aad5a8 WatchSource:0}: Error finding container 96419bb8b5bbf7d87c9cb57272d38d8a0fe47eace60e29f33b6d47f805aad5a8: Status 404 returned error can't find the container with id 96419bb8b5bbf7d87c9cb57272d38d8a0fe47eace60e29f33b6d47f805aad5a8 Apr 16 14:52:10.777473 
ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.777446 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2d6237_3c32_44f2_bf46_6f36e887e3c2.slice/crio-ae5a960e2293878d6a633220045093c388cd5cff0a08c92c3d2fe4e78f2676d3 WatchSource:0}: Error finding container ae5a960e2293878d6a633220045093c388cd5cff0a08c92c3d2fe4e78f2676d3: Status 404 returned error can't find the container with id ae5a960e2293878d6a633220045093c388cd5cff0a08c92c3d2fe4e78f2676d3 Apr 16 14:52:10.798863 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.798835 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf593ceff_50dd_4533_bbeb_0dcc375c12b9.slice/crio-b2ed6d7248d1012180e247c35eda74606c29ff2cab32e694e7c0413419064ece WatchSource:0}: Error finding container b2ed6d7248d1012180e247c35eda74606c29ff2cab32e694e7c0413419064ece: Status 404 returned error can't find the container with id b2ed6d7248d1012180e247c35eda74606c29ff2cab32e694e7c0413419064ece Apr 16 14:52:10.799071 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:10.799049 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:10.799272 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.799212 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:10.799272 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.799235 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:10.799272 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.799248 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:10.799449 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:10.799305 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed. No retries permitted until 2026-04-16 14:52:11.799285599 +0000 UTC m=+4.180283192 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:10.800059 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.799997 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fc647f3_6334_4298_9319_057ba1d04ae8.slice/crio-3770a4c1a821ca75618751a904bf8b69b6365de3389fb9ebd41d48bf8ff22836 WatchSource:0}: Error finding container 3770a4c1a821ca75618751a904bf8b69b6365de3389fb9ebd41d48bf8ff22836: Status 404 returned error can't find the container with id 3770a4c1a821ca75618751a904bf8b69b6365de3389fb9ebd41d48bf8ff22836 Apr 16 14:52:10.800913 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.800879 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod227b75fb_5d17_43d6_a870_e0959b3989c4.slice/crio-bfa54f57cc5288a4c723e5e28632062f5f4e768b5af2a30016eb5599b3717fdf WatchSource:0}: Error finding container bfa54f57cc5288a4c723e5e28632062f5f4e768b5af2a30016eb5599b3717fdf: Status 404 returned error can't find the container with id bfa54f57cc5288a4c723e5e28632062f5f4e768b5af2a30016eb5599b3717fdf Apr 16 14:52:10.802949 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:10.802929 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a20462_c104_4c17_a123_4c9d7acb06df.slice/crio-4f26e57213074b5ed88bead9b8f3abca38fd42f0a75e0b217ef501275e3739c0 WatchSource:0}: Error finding container 4f26e57213074b5ed88bead9b8f3abca38fd42f0a75e0b217ef501275e3739c0: Status 404 returned error can't find the container with id 4f26e57213074b5ed88bead9b8f3abca38fd42f0a75e0b217ef501275e3739c0 Apr 16 14:52:11.116764 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.116404 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:09 +0000 UTC" deadline="2027-10-06 16:07:13.44205243 +0000 UTC" Apr 16 14:52:11.116764 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.116704 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12913h15m2.325352973s" Apr 16 14:52:11.235120 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.234468 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" event={"ID":"2e98882ee5619cf5d9be6b369dc8f0f8","Type":"ContainerStarted","Data":"ae2437204a71023a136280cbcf9bbc081a45a5f48e6c5e1056a4388211384437"} Apr 16 14:52:11.243150 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.243087 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-nzfb6" event={"ID":"c9a20462-c104-4c17-a123-4c9d7acb06df","Type":"ContainerStarted","Data":"4f26e57213074b5ed88bead9b8f3abca38fd42f0a75e0b217ef501275e3739c0"} Apr 16 14:52:11.247259 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.247162 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-577nk" event={"ID":"227b75fb-5d17-43d6-a870-e0959b3989c4","Type":"ContainerStarted","Data":"bfa54f57cc5288a4c723e5e28632062f5f4e768b5af2a30016eb5599b3717fdf"} Apr 16 14:52:11.248988 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.248553 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-86.ec2.internal" podStartSLOduration=2.248538324 podStartE2EDuration="2.248538324s" podCreationTimestamp="2026-04-16 14:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:11.248222405 +0000 UTC m=+3.629220020" watchObservedRunningTime="2026-04-16 14:52:11.248538324 +0000 UTC m=+3.629535955" Apr 16 14:52:11.254744 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.254717 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tkx2t" event={"ID":"f593ceff-50dd-4533-bbeb-0dcc375c12b9","Type":"ContainerStarted","Data":"b2ed6d7248d1012180e247c35eda74606c29ff2cab32e694e7c0413419064ece"} Apr 16 14:52:11.259250 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.258581 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4h6ft" event={"ID":"c6c72ed9-22d4-4b46-99b0-c1f258e78270","Type":"ContainerStarted","Data":"96419bb8b5bbf7d87c9cb57272d38d8a0fe47eace60e29f33b6d47f805aad5a8"} Apr 16 14:52:11.262632 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.261470 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" 
event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerStarted","Data":"3c71420c967bc5072010f1c817b0f091ff4c0497ae818cb87b77791732f90ec8"} Apr 16 14:52:11.264496 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.264447 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" event={"ID":"3fc647f3-6334-4298-9319-057ba1d04ae8","Type":"ContainerStarted","Data":"3770a4c1a821ca75618751a904bf8b69b6365de3389fb9ebd41d48bf8ff22836"} Apr 16 14:52:11.267343 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.267313 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"ae5a960e2293878d6a633220045093c388cd5cff0a08c92c3d2fe4e78f2676d3"} Apr 16 14:52:11.281180 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.280965 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" event={"ID":"59571617-fe46-4cd8-8766-d1b4dbb300e8","Type":"ContainerStarted","Data":"a256304dbb5abf2e846144cabc7235a4b1e01df9a84b824ac301fa99623c275b"} Apr 16 14:52:11.709192 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.709163 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:11.709362 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:11.709284 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:11.709362 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:11.709327 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:13.709314284 +0000 UTC m=+6.090311876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:11.811945 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:11.811864 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:11.812095 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:11.812073 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:11.812095 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:11.812094 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:11.812197 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:11.812107 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:11.812197 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:11.812165 2561 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed. No retries permitted until 2026-04-16 14:52:13.812145668 +0000 UTC m=+6.193143312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:12.223375 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:12.223297 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:12.223803 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:12.223431 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:12.223876 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:12.223857 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:12.224051 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:12.223972 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:12.297479 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:12.297442 2561 generic.go:358] "Generic (PLEG): container finished" podID="c411056c01183f890ff0140711b4d529" containerID="cbadd690d13d4275eb1ad511f427d6289182f9b696c6efd6de412a4a5ae52f4f" exitCode=0
Apr 16 14:52:12.298354 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:12.298326 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" event={"ID":"c411056c01183f890ff0140711b4d529","Type":"ContainerDied","Data":"cbadd690d13d4275eb1ad511f427d6289182f9b696c6efd6de412a4a5ae52f4f"}
Apr 16 14:52:13.304565 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:13.303911 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" event={"ID":"c411056c01183f890ff0140711b4d529","Type":"ContainerStarted","Data":"107a1a7dfbb9f19e3d3375f82048c70b5272d11bb254d35abfece14e731dd263"}
Apr 16 14:52:13.729403 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:13.728775 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:13.729403 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:13.728963 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:13.729403 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:13.729030 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.729010853 +0000 UTC m=+10.110008456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:13.829336 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:13.829301 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:13.829542 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:13.829511 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:13.829542 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:13.829536 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:13.829695 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:13.829549 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:13.829695 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:13.829623 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.829586783 +0000 UTC m=+10.210584379 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:14.222588 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:14.221319 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:14.222588 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:14.221328 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:14.222588 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:14.221441 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:14.222588 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:14.221522 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:15.456506 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.456453 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-86.ec2.internal" podStartSLOduration=6.456434209 podStartE2EDuration="6.456434209s" podCreationTimestamp="2026-04-16 14:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:13.324785292 +0000 UTC m=+5.705782919" watchObservedRunningTime="2026-04-16 14:52:15.456434209 +0000 UTC m=+7.837431824"
Apr 16 14:52:15.456993 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.456696 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h54t4"]
Apr 16 14:52:15.463065 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.463027 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.467376 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.466516 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 14:52:15.467376 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.466771 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 14:52:15.467376 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.467013 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cj8gs\""
Apr 16 14:52:15.543993 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.543964 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72695b7c-a9cd-4e46-80e9-10740ab7de94-hosts-file\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.544173 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.544013 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gg5d\" (UniqueName: \"kubernetes.io/projected/72695b7c-a9cd-4e46-80e9-10740ab7de94-kube-api-access-6gg5d\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.544173 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.544094 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72695b7c-a9cd-4e46-80e9-10740ab7de94-tmp-dir\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.645602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.644976 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72695b7c-a9cd-4e46-80e9-10740ab7de94-tmp-dir\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.645602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.645022 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72695b7c-a9cd-4e46-80e9-10740ab7de94-hosts-file\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.645602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.645053 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gg5d\" (UniqueName: \"kubernetes.io/projected/72695b7c-a9cd-4e46-80e9-10740ab7de94-kube-api-access-6gg5d\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.645602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.645535 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72695b7c-a9cd-4e46-80e9-10740ab7de94-hosts-file\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.645957 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.645718 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72695b7c-a9cd-4e46-80e9-10740ab7de94-tmp-dir\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.654511 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.654486 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gg5d\" (UniqueName: \"kubernetes.io/projected/72695b7c-a9cd-4e46-80e9-10740ab7de94-kube-api-access-6gg5d\") pod \"node-resolver-h54t4\" (UID: \"72695b7c-a9cd-4e46-80e9-10740ab7de94\") " pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:15.776402 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:15.776062 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h54t4"
Apr 16 14:52:16.221818 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.221442 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:16.221818 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.221472 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:16.221818 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:16.221581 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:16.221818 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:16.221650 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:16.506937 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.506788 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6fksx"]
Apr 16 14:52:16.509923 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.509900 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.510065 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:16.509983 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502"
Apr 16 14:52:16.552647 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.552599 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2a785a0f-ff3d-4e46-9883-f604a6fec502-dbus\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.552788 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.552663 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2a785a0f-ff3d-4e46-9883-f604a6fec502-kubelet-config\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.552788 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.552696 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.653399 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2a785a0f-ff3d-4e46-9883-f604a6fec502-dbus\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.653442 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2a785a0f-ff3d-4e46-9883-f604a6fec502-kubelet-config\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.653470 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.653583 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2a785a0f-ff3d-4e46-9883-f604a6fec502-dbus\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:16.653597 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:16.653665 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret podName:2a785a0f-ff3d-4e46-9883-f604a6fec502 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.15364516 +0000 UTC m=+9.534642757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret") pod "global-pull-secret-syncer-6fksx" (UID: "2a785a0f-ff3d-4e46-9883-f604a6fec502") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:16.653698 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:16.653660 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2a785a0f-ff3d-4e46-9883-f604a6fec502-kubelet-config\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:17.156754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:17.156719 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:17.156938 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.156863 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:17.156938 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.156927 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret podName:2a785a0f-ff3d-4e46-9883-f604a6fec502 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:18.156908151 +0000 UTC m=+10.537905748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret") pod "global-pull-secret-syncer-6fksx" (UID: "2a785a0f-ff3d-4e46-9883-f604a6fec502") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:17.761312 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:17.761282 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:17.761711 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.761470 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:17.761711 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.761539 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:25.761518993 +0000 UTC m=+18.142516599 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:17.862568 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:17.862529 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:17.862752 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.862725 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:17.862752 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.862748 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:17.862859 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.862761 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:17.862859 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:17.862824 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed. No retries permitted until 2026-04-16 14:52:25.862803565 +0000 UTC m=+18.243801172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:18.165644 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:18.165540 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:18.165800 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:18.165670 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:18.165800 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:18.165721 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret podName:2a785a0f-ff3d-4e46-9883-f604a6fec502 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:20.16570768 +0000 UTC m=+12.546705272 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret") pod "global-pull-secret-syncer-6fksx" (UID: "2a785a0f-ff3d-4e46-9883-f604a6fec502") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:18.222496 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:18.222444 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:18.222668 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:18.222521 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:18.222668 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:18.222545 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:18.222668 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:18.222635 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:18.222978 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:18.222936 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502"
Apr 16 14:52:18.223103 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:18.223017 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:20.179342 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:20.179300 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:20.179821 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:20.179452 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:20.179821 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:20.179517 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret podName:2a785a0f-ff3d-4e46-9883-f604a6fec502 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:24.179501209 +0000 UTC m=+16.560498806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret") pod "global-pull-secret-syncer-6fksx" (UID: "2a785a0f-ff3d-4e46-9883-f604a6fec502") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:20.221416 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:20.221388 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:20.221576 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:20.221418 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:20.221576 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:20.221395 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:20.221576 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:20.221495 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:20.221764 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:20.221595 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:20.221764 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:20.221676 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502"
Apr 16 14:52:22.221193 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:22.221159 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:22.221193 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:22.221190 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:22.221691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:22.221161 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:22.221691 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:22.221299 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:22.221691 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:22.221382 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:22.221691 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:22.221472 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502"
Apr 16 14:52:24.214034 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:24.213998 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:24.214481 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:24.214163 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:24.214481 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:24.214234 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret podName:2a785a0f-ff3d-4e46-9883-f604a6fec502 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:32.21421642 +0000 UTC m=+24.595214013 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret") pod "global-pull-secret-syncer-6fksx" (UID: "2a785a0f-ff3d-4e46-9883-f604a6fec502") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:24.221077 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:24.221037 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:24.221077 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:24.221057 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:24.221269 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:24.221043 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:24.221269 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:24.221165 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502"
Apr 16 14:52:24.221369 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:24.221275 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db"
Apr 16 14:52:24.221369 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:24.221358 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45"
Apr 16 14:52:25.824203 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:25.824165 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:25.824638 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:25.824288 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:25.824638 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:25.824350 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:41.824332808 +0000 UTC m=+34.205330412 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:25.925503 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:25.925472 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:25.925677 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:25.925621 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:25.925677 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:25.925642 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:25.925677 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:25.925655 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:25.925865 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:25.925716 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed.
No retries permitted until 2026-04-16 14:52:41.925695885 +0000 UTC m=+34.306693498 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:26.221888 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:26.221797 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:26.221888 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:26.221846 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:26.222106 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:26.221797 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:26.222106 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:26.221926 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:26.222106 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:26.222001 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:26.222106 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:26.222085 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:27.442917 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:27.442892 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72695b7c_a9cd_4e46_80e9_10740ab7de94.slice/crio-ed244b6ed2bc014e4fecae22cbf8141dc493c6477e668aa54d0ff147abd61d9a WatchSource:0}: Error finding container ed244b6ed2bc014e4fecae22cbf8141dc493c6477e668aa54d0ff147abd61d9a: Status 404 returned error can't find the container with id ed244b6ed2bc014e4fecae22cbf8141dc493c6477e668aa54d0ff147abd61d9a Apr 16 14:52:28.222825 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.222126 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:28.222825 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:28.222415 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:28.222825 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.222218 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:28.222825 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.222192 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:28.222825 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:28.222496 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:28.222825 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:28.222596 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:28.328403 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.328304 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tkx2t" event={"ID":"f593ceff-50dd-4533-bbeb-0dcc375c12b9","Type":"ContainerStarted","Data":"7804240fed22c64794cccfeb288881ebeacbccfcfebb6680a6577d656952e2d8"} Apr 16 14:52:28.329696 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.329666 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4h6ft" event={"ID":"c6c72ed9-22d4-4b46-99b0-c1f258e78270","Type":"ContainerStarted","Data":"5b78dc6b3e4baeb5db40622aa5498532cc8837d4da772a0e81ada3dc98046fe2"} Apr 16 14:52:28.331178 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.331150 2561 generic.go:358] "Generic (PLEG): container finished" podID="c1f3fc48-12ff-4b51-a439-082e842f2b08" containerID="21110e59257de87d3cb9a24f6e1d321e27af6dcb33a05eba194d469b291086d3" exitCode=0 Apr 16 14:52:28.331297 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.331221 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerDied","Data":"21110e59257de87d3cb9a24f6e1d321e27af6dcb33a05eba194d469b291086d3"} Apr 16 14:52:28.332628 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.332519 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h54t4" event={"ID":"72695b7c-a9cd-4e46-80e9-10740ab7de94","Type":"ContainerStarted","Data":"3749939337b7225b3f43f6688f818755653757cb12ebbb209b28332468aa763d"} Apr 16 14:52:28.332628 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.332543 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h54t4" 
event={"ID":"72695b7c-a9cd-4e46-80e9-10740ab7de94","Type":"ContainerStarted","Data":"ed244b6ed2bc014e4fecae22cbf8141dc493c6477e668aa54d0ff147abd61d9a"} Apr 16 14:52:28.333994 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.333974 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" event={"ID":"3fc647f3-6334-4298-9319-057ba1d04ae8","Type":"ContainerStarted","Data":"7c9640b0dbd5f8ea43080b15aebcd40e7ba2a9fd8c8edfff6e8a54665baa129f"} Apr 16 14:52:28.336548 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.336524 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"6df07c4e58910cc48ff34abf4f90ad2414f09806ab07e63b05ad3111c626144c"} Apr 16 14:52:28.336665 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.336554 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"92058e67ca0d9798d6a4f20ae4db759fff4aab06e666fdb37eae5dc52cf6f2f3"} Apr 16 14:52:28.336665 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.336566 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"998fc17eff6269cfddf356ad209225f665cb6ab4881449c3948b8bbebb170113"} Apr 16 14:52:28.336665 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.336577 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"1f92dd3ab9018e2add4621ef76bca4361d5ea2e433f20ca1741ceae939b4248f"} Apr 16 14:52:28.336665 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.336589 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"9d1c3e82add547953ccdbe4d79d4382d7e0fdce9ae7abe72ef939b46f7f1dfe4"} Apr 16 14:52:28.337909 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.337865 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" event={"ID":"59571617-fe46-4cd8-8766-d1b4dbb300e8","Type":"ContainerStarted","Data":"3844d55775bff0ee7b7fe4a8e985d22c1b8138d6bc76c31d7a4ddba6196d8f0b"} Apr 16 14:52:28.340870 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.340752 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nzfb6" event={"ID":"c9a20462-c104-4c17-a123-4c9d7acb06df","Type":"ContainerStarted","Data":"b6cc33557429d2c9ea83f8926ef99de10ea8f6a47bafc4af9827bc02e35cf76f"} Apr 16 14:52:28.345091 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.345020 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tkx2t" podStartSLOduration=3.883889624 podStartE2EDuration="20.345002475s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.802439331 +0000 UTC m=+3.183436928" lastFinishedPulling="2026-04-16 14:52:27.263552173 +0000 UTC m=+19.644549779" observedRunningTime="2026-04-16 14:52:28.34377108 +0000 UTC m=+20.724768697" watchObservedRunningTime="2026-04-16 14:52:28.345002475 +0000 UTC m=+20.726000089" Apr 16 14:52:28.358081 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.358047 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4h6ft" podStartSLOduration=3.639645254 podStartE2EDuration="20.358036512s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.7778396 +0000 UTC m=+3.158837197" lastFinishedPulling="2026-04-16 14:52:27.496230863 +0000 UTC m=+19.877228455" 
observedRunningTime="2026-04-16 14:52:28.357664921 +0000 UTC m=+20.738662544" watchObservedRunningTime="2026-04-16 14:52:28.358036512 +0000 UTC m=+20.739034155" Apr 16 14:52:28.389836 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.389799 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nzfb6" podStartSLOduration=11.933462423 podStartE2EDuration="20.389788687s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.805034487 +0000 UTC m=+3.186032083" lastFinishedPulling="2026-04-16 14:52:19.26136075 +0000 UTC m=+11.642358347" observedRunningTime="2026-04-16 14:52:28.3894648 +0000 UTC m=+20.770462414" watchObservedRunningTime="2026-04-16 14:52:28.389788687 +0000 UTC m=+20.770786322" Apr 16 14:52:28.400812 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.400767 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h54t4" podStartSLOduration=13.400753598 podStartE2EDuration="13.400753598s" podCreationTimestamp="2026-04-16 14:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:28.400448724 +0000 UTC m=+20.781446340" watchObservedRunningTime="2026-04-16 14:52:28.400753598 +0000 UTC m=+20.781751212" Apr 16 14:52:28.414793 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.414752 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bg8fh" podStartSLOduration=3.706143605 podStartE2EDuration="20.414739015s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.77639502 +0000 UTC m=+3.157392613" lastFinishedPulling="2026-04-16 14:52:27.484990415 +0000 UTC m=+19.865988023" observedRunningTime="2026-04-16 14:52:28.414679174 +0000 UTC m=+20.795676785" watchObservedRunningTime="2026-04-16 14:52:28.414739015 +0000 
UTC m=+20.795736631" Apr 16 14:52:28.603106 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:28.603084 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:52:29.143333 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.143231 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:28.603100595Z","UUID":"cded77d2-ae6c-472a-bc66-0d7ba81c2c6b","Handler":null,"Name":"","Endpoint":""} Apr 16 14:52:29.145099 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.145075 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:52:29.145099 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.145104 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:52:29.344987 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.344937 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-577nk" event={"ID":"227b75fb-5d17-43d6-a870-e0959b3989c4","Type":"ContainerStarted","Data":"397a3b043a559fdfd6a48780c81240448034aa916700e23d89274dad56be0bb7"} Apr 16 14:52:29.346920 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.346886 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" event={"ID":"3fc647f3-6334-4298-9319-057ba1d04ae8","Type":"ContainerStarted","Data":"c682462e25a013c18324eb6f14741e430bf9a65c05fc80f4f2d8c9a7d2564fbe"} Apr 16 14:52:29.349933 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.349903 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"a3d415e37459fd51df165358e09dfc24ea036777778a870116cb11682b176b09"} Apr 16 14:52:29.357937 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:29.357889 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-577nk" podStartSLOduration=4.896687683 podStartE2EDuration="21.357871643s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.802630315 +0000 UTC m=+3.183627917" lastFinishedPulling="2026-04-16 14:52:27.263814285 +0000 UTC m=+19.644811877" observedRunningTime="2026-04-16 14:52:29.357621363 +0000 UTC m=+21.738618968" watchObservedRunningTime="2026-04-16 14:52:29.357871643 +0000 UTC m=+21.738869260" Apr 16 14:52:30.221256 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:30.221222 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:30.221256 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:30.221255 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:30.221769 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:30.221222 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:30.221769 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:30.221339 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:30.221769 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:30.221429 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:30.221769 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:30.221590 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:30.353337 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:30.353297 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" event={"ID":"3fc647f3-6334-4298-9319-057ba1d04ae8","Type":"ContainerStarted","Data":"6beb20565bcd6071a0a86e4ef6672a9510e022c649b004ff19eccab86b86e8ab"} Apr 16 14:52:30.364882 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:30.364853 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:30.380767 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:30.380715 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-94bvm" podStartSLOduration=3.797116543 podStartE2EDuration="22.380697237s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 
14:52:10.80218733 +0000 UTC m=+3.183184923" lastFinishedPulling="2026-04-16 14:52:29.385768022 +0000 UTC m=+21.766765617" observedRunningTime="2026-04-16 14:52:30.378965529 +0000 UTC m=+22.759963145" watchObservedRunningTime="2026-04-16 14:52:30.380697237 +0000 UTC m=+22.761694853" Apr 16 14:52:31.358885 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:31.358850 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"6bd439977b0ef43c17c7d6eafe36b97092f4a5b4ef8c0a0b4dbf086addea6862"} Apr 16 14:52:32.221761 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:32.221734 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:32.221922 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:32.221772 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:32.221922 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:32.221733 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:32.221922 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:32.221843 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:32.221922 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:32.221907 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:32.222135 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:32.221971 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:32.279977 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:32.279953 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:32.280096 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:32.280041 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:32.280096 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:32.280082 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret podName:2a785a0f-ff3d-4e46-9883-f604a6fec502 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:48.280069023 +0000 UTC m=+40.661066615 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret") pod "global-pull-secret-syncer-6fksx" (UID: "2a785a0f-ff3d-4e46-9883-f604a6fec502") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:32.928004 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:32.927811 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:32.928625 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:32.928327 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:33.363426 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.363392 2561 generic.go:358] "Generic (PLEG): container finished" podID="c1f3fc48-12ff-4b51-a439-082e842f2b08" containerID="656073550da9545c9269728ec228df652ff2265be47362912352301620108777" exitCode=0 Apr 16 14:52:33.363584 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.363484 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerDied","Data":"656073550da9545c9269728ec228df652ff2265be47362912352301620108777"} Apr 16 14:52:33.366754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.366733 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" event={"ID":"cf2d6237-3c32-44f2-bf46-6f36e887e3c2","Type":"ContainerStarted","Data":"fd8f54ae68c0405d8e5a8ac8c53ed236235df0990b4f78962e562e58f492ecd1"} Apr 16 14:52:33.367004 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.366984 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 
14:52:33.367115 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.367092 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:33.367115 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.367112 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:33.367498 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.367482 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nzfb6" Apr 16 14:52:33.380756 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.380733 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:33.380847 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.380798 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" Apr 16 14:52:33.418049 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:33.418009 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7" podStartSLOduration=8.383707984 podStartE2EDuration="25.417997033s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.797370987 +0000 UTC m=+3.178368580" lastFinishedPulling="2026-04-16 14:52:27.831660013 +0000 UTC m=+20.212657629" observedRunningTime="2026-04-16 14:52:33.416820216 +0000 UTC m=+25.797817842" watchObservedRunningTime="2026-04-16 14:52:33.417997033 +0000 UTC m=+25.798994647" Apr 16 14:52:34.221556 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.221527 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:34.221993 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.221665 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:34.221993 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:34.221673 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:34.221993 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:34.221788 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:34.221993 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.221847 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:34.221993 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:34.221918 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:34.371523 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.371494 2561 generic.go:358] "Generic (PLEG): container finished" podID="c1f3fc48-12ff-4b51-a439-082e842f2b08" containerID="1a2dadd3065152efa2ae72d91e36f55d7da2a718aa61c634b30241e9555548e9" exitCode=0 Apr 16 14:52:34.371697 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.371579 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerDied","Data":"1a2dadd3065152efa2ae72d91e36f55d7da2a718aa61c634b30241e9555548e9"} Apr 16 14:52:34.588190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.588162 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cp47z"] Apr 16 14:52:34.588365 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.588250 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:34.588365 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:34.588328 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:34.591709 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.591675 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6fksx"] Apr 16 14:52:34.591820 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.591757 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:34.591893 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:34.591837 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:34.592407 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.592382 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bppkn"] Apr 16 14:52:34.592478 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:34.592468 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:34.592565 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:34.592547 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:35.375681 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:35.375653 2561 generic.go:358] "Generic (PLEG): container finished" podID="c1f3fc48-12ff-4b51-a439-082e842f2b08" containerID="2f0a9160adbc015d05e255263496f8291a47cdd2e3d86dc9ef25cfb457098c67" exitCode=0 Apr 16 14:52:35.376010 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:35.375739 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerDied","Data":"2f0a9160adbc015d05e255263496f8291a47cdd2e3d86dc9ef25cfb457098c67"} Apr 16 14:52:36.221506 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:36.221468 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:36.221703 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:36.221468 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:36.221703 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:36.221602 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:36.221703 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:36.221468 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:36.221867 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:36.221706 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:36.221867 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:36.221729 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:38.222573 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:38.222391 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:38.223109 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:38.222430 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:38.223109 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:38.222463 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:38.223109 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:38.222766 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:38.223109 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:38.222738 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:38.223109 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:38.222870 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:40.221245 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.221213 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx" Apr 16 14:52:40.221737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.221223 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:52:40.221737 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.221346 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6fksx" podUID="2a785a0f-ff3d-4e46-9883-f604a6fec502" Apr 16 14:52:40.221737 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.221223 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:52:40.221737 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.221444 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:52:40.221737 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.221525 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cp47z" podUID="d37fc185-ce8f-4c06-ace2-ca0a852977db" Apr 16 14:52:40.489117 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.489043 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-86.ec2.internal" event="NodeReady" Apr 16 14:52:40.489283 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.489173 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:52:40.547318 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.547288 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68684b6fc-df98m"] Apr 16 14:52:40.580996 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.580968 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5lzf7"] Apr 16 14:52:40.581171 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.581154 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.583579 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.583555 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:52:40.583905 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.583828 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:52:40.584026 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.583906 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:52:40.584250 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.584211 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gptk5\"" Apr 16 14:52:40.590066 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:52:40.590045 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:52:40.596842 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.596823 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68684b6fc-df98m"] Apr 16 14:52:40.596925 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.596848 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vlvr9"] Apr 16 14:52:40.596981 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.596959 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.599304 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.599284 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:52:40.599422 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.599365 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-njff8\"" Apr 16 14:52:40.599484 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.599455 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:52:40.613444 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.613367 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5lzf7"] Apr 16 14:52:40.613553 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.613458 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vlvr9"] Apr 16 14:52:40.613553 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.613471 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:52:40.618761 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.616026 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gcjjc\"" Apr 16 14:52:40.618761 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.616323 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:52:40.618761 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.616337 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:52:40.618761 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.616423 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:52:40.747280 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747198 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-bound-sa-token\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747280 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747236 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8c6\" (UniqueName: \"kubernetes.io/projected/18926de0-0561-424c-845b-6ea1059c821a-kube-api-access-wx8c6\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.747492 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747346 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-certificates\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747492 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747375 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-installation-pull-secrets\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747492 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747404 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.747492 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747461 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-trusted-ca\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747492 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747489 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18926de0-0561-424c-845b-6ea1059c821a-config-volume\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 
14:52:40.747755 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747523 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqd7\" (UniqueName: \"kubernetes.io/projected/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-kube-api-access-vmqd7\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:52:40.747755 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747592 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18926de0-0561-424c-845b-6ea1059c821a-tmp-dir\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.747755 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747647 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d105a85e-d684-4862-877b-ab11c5b1ca26-ca-trust-extracted\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747755 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747671 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:52:40.747755 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747703 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-image-registry-private-configuration\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747997 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747759 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-kube-api-access-6sghm\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.747997 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.747793 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.848545 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848512 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18926de0-0561-424c-845b-6ea1059c821a-config-volume\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.848545 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848552 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqd7\" (UniqueName: \"kubernetes.io/projected/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-kube-api-access-vmqd7\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:52:40.848829 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:52:40.848572 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18926de0-0561-424c-845b-6ea1059c821a-tmp-dir\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.848829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848592 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d105a85e-d684-4862-877b-ab11c5b1ca26-ca-trust-extracted\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.848829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848633 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:52:40.848829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848659 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-image-registry-private-configuration\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.848829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848679 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-kube-api-access-6sghm\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " 
pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.848829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848704 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.848829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848737 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-bound-sa-token\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.848944 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848947 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/18926de0-0561-424c-845b-6ea1059c821a-tmp-dir\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848980 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d105a85e-d684-4862-877b-ab11c5b1ca26-ca-trust-extracted\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.848998 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8c6\" (UniqueName: \"kubernetes.io/projected/18926de0-0561-424c-845b-6ea1059c821a-kube-api-access-wx8c6\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.849033 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:41.349012059 +0000 UTC m=+33.730009655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.849063 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.849074 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.849086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-certificates\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.849117 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:41.349105982 +0000 UTC m=+33.730103574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.849132 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-installation-pull-secrets\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.849163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.849150 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.849553 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.849183 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-trusted-ca\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.849553 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.849210 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/18926de0-0561-424c-845b-6ea1059c821a-config-volume\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:52:40.849553 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.849259 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:40.849553 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:40.849297 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:41.349283288 +0000 UTC m=+33.730280889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found Apr 16 14:52:40.849707 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.849658 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-certificates\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.850078 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.850055 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-trusted-ca\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:52:40.852632 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.852588 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-installation-pull-secrets\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:40.852632 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.852586 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-image-registry-private-configuration\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:40.857795 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.857769 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqd7\" (UniqueName: \"kubernetes.io/projected/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-kube-api-access-vmqd7\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:52:40.858084 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.858066 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8c6\" (UniqueName: \"kubernetes.io/projected/18926de0-0561-424c-845b-6ea1059c821a-kube-api-access-wx8c6\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:52:40.858418 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.858397 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-bound-sa-token\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:40.858470 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:40.858409 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-kube-api-access-6sghm\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:41.354409 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:41.354386 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:41.354438 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:41.354487 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354528 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354584 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.354569389 +0000 UTC m=+34.735566981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354593 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354624 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354636 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354673 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.354657929 +0000 UTC m=+34.735655526 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found
Apr 16 14:52:41.354894 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.354685 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.354679732 +0000 UTC m=+34.735677327 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found
Apr 16 14:52:41.858497 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:41.858467 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:41.858732 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.858589 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:41.858732 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.858666 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:13.858647592 +0000 UTC m=+66.239645184 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:41.959027 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:41.959004 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:41.959172 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.959155 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:41.959213 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.959177 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:41.959213 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.959187 2561 projected.go:194] Error preparing data for projected volume kube-api-access-snssf for pod openshift-network-diagnostics/network-check-target-cp47z: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:41.959278 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:41.959231 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf podName:d37fc185-ce8f-4c06-ace2-ca0a852977db nodeName:}" failed. No retries permitted until 2026-04-16 14:53:13.959218607 +0000 UTC m=+66.340216199 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-snssf" (UniqueName: "kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf") pod "network-check-target-cp47z" (UID: "d37fc185-ce8f-4c06-ace2-ca0a852977db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:42.221238 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.221158 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:42.221397 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.221163 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:52:42.221465 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.221171 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z"
Apr 16 14:52:42.226270 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.226252 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:52:42.226403 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.226321 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:52:42.227398 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.227378 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:52:42.227561 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.227541 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hst6r\""
Apr 16 14:52:42.227659 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.227629 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:52:42.227659 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.227633 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dpmcp\""
Apr 16 14:52:42.362127 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.362099 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.362147 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.362199 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362242 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362297 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362309 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.362293973 +0000 UTC m=+36.743291564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362310 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362330 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362333 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.362323259 +0000 UTC m=+36.743320851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found
Apr 16 14:52:42.362463 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:42.362372 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.362358133 +0000 UTC m=+36.743355739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found
Apr 16 14:52:42.390922 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.390900 2561 generic.go:358] "Generic (PLEG): container finished" podID="c1f3fc48-12ff-4b51-a439-082e842f2b08" containerID="29a493d84f63c87f8fb7dc1dfe82a01d30c7b315338d3f45b429e34b31c7d1fe" exitCode=0
Apr 16 14:52:42.391026 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:42.390928 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerDied","Data":"29a493d84f63c87f8fb7dc1dfe82a01d30c7b315338d3f45b429e34b31c7d1fe"}
Apr 16 14:52:43.398331 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:43.398131 2561 generic.go:358] "Generic (PLEG): container finished" podID="c1f3fc48-12ff-4b51-a439-082e842f2b08" containerID="57fb82608944deb41b23f92727e900c74d4ea96d5bbb6db70a1adb1fd336d919" exitCode=0
Apr 16 14:52:43.398713 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:43.398212 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerDied","Data":"57fb82608944deb41b23f92727e900c74d4ea96d5bbb6db70a1adb1fd336d919"}
Apr 16 14:52:44.379600 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:44.379527 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:52:44.379600 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:44.379573 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:44.379640 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379677 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379732 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.37971769 +0000 UTC m=+40.760715281 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379750 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379792 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.379781371 +0000 UTC m=+40.760778962 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379753 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:52:44.379804 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379809 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found
Apr 16 14:52:44.380034 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:44.379837 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:48.379827783 +0000 UTC m=+40.760825375 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found
Apr 16 14:52:44.403935 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:44.403913 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-575fz" event={"ID":"c1f3fc48-12ff-4b51-a439-082e842f2b08","Type":"ContainerStarted","Data":"b2fd814df613346473ca55d3def05818e03523abcabfa5cb724a56502ef964fc"}
Apr 16 14:52:44.425117 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:44.425076 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-575fz" podStartSLOduration=5.951269304 podStartE2EDuration="36.425062896s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:52:10.776401752 +0000 UTC m=+3.157399349" lastFinishedPulling="2026-04-16 14:52:41.250195335 +0000 UTC m=+33.631192941" observedRunningTime="2026-04-16 14:52:44.42485561 +0000 UTC m=+36.805853226" watchObservedRunningTime="2026-04-16 14:52:44.425062896 +0000 UTC m=+36.806060511"
Apr 16 14:52:48.303725 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.303684 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:48.306871 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.306854 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2a785a0f-ff3d-4e46-9883-f604a6fec502-original-pull-secret\") pod \"global-pull-secret-syncer-6fksx\" (UID: \"2a785a0f-ff3d-4e46-9883-f604a6fec502\") " pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:48.404089 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.404056 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:48.404241 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.404117 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:52:48.404241 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.404177 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:52:48.404241 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404225 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:52:48.404397 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404245 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found
Apr 16 14:52:48.404397 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404265 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:48.404397 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404292 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:48.404397 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404315 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.404294954 +0000 UTC m=+48.785292554 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found
Apr 16 14:52:48.404397 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404329 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.404323147 +0000 UTC m=+48.785320739 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found
Apr 16 14:52:48.404397 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:48.404365 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.404346166 +0000 UTC m=+48.785343772 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found
Apr 16 14:52:48.531622 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.531570 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6fksx"
Apr 16 14:52:48.687971 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:48.687939 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6fksx"]
Apr 16 14:52:48.691114 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:52:48.691089 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a785a0f_ff3d_4e46_9883_f604a6fec502.slice/crio-eb9ed2d72da31a8a5e9399cb130371c90e9de792524c5fffc3cdb22980793b09 WatchSource:0}: Error finding container eb9ed2d72da31a8a5e9399cb130371c90e9de792524c5fffc3cdb22980793b09: Status 404 returned error can't find the container with id eb9ed2d72da31a8a5e9399cb130371c90e9de792524c5fffc3cdb22980793b09
Apr 16 14:52:49.413152 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:49.413110 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6fksx" event={"ID":"2a785a0f-ff3d-4e46-9883-f604a6fec502","Type":"ContainerStarted","Data":"eb9ed2d72da31a8a5e9399cb130371c90e9de792524c5fffc3cdb22980793b09"}
Apr 16 14:52:53.421869 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:53.421830 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6fksx" event={"ID":"2a785a0f-ff3d-4e46-9883-f604a6fec502","Type":"ContainerStarted","Data":"349fc6663327e00a0e8fb67d4f3ce5914dd0c15c3dab04ad5105de5ed0cfbc0b"}
Apr 16 14:52:56.463450 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:56.463411 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:52:56.463450 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:56.463456 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:52:56.463503 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463546 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463603 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:12.463588792 +0000 UTC m=+64.844586384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463602 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463602 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463677 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:12.463661969 +0000 UTC m=+64.844659565 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463634 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found
Apr 16 14:52:56.463951 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:52:56.463720 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:12.463709566 +0000 UTC m=+64.844707161 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found
Apr 16 14:53:05.395872 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:05.395843 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtjk7"
Apr 16 14:53:05.421190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:05.421147 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6fksx" podStartSLOduration=45.52872271 podStartE2EDuration="49.421133676s" podCreationTimestamp="2026-04-16 14:52:16 +0000 UTC" firstStartedPulling="2026-04-16 14:52:48.692844026 +0000 UTC m=+41.073841633" lastFinishedPulling="2026-04-16 14:52:52.585255004 +0000 UTC m=+44.966252599" observedRunningTime="2026-04-16 14:52:53.441560133 +0000 UTC m=+45.822557746" watchObservedRunningTime="2026-04-16 14:53:05.421133676 +0000 UTC m=+57.802131289"
Apr 16 14:53:12.475393 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:12.475361 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9"
Apr 16 14:53:12.475393 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:12.475397 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:12.475431 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7"
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475504 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475508 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475546 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:44.475533299 +0000 UTC m=+96.856530890 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475565 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:44.475551987 +0000 UTC m=+96.856549583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475509 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475578 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found
Apr 16 14:53:12.475815 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:12.475600 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:44.475594147 +0000 UTC m=+96.856591739 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found
Apr 16 14:53:13.883924 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:13.883886 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:53:13.886557 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:13.886539 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:53:13.894685 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:13.894663 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:53:13.894773 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:13.894730 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:17.894710483 +0000 UTC m=+130.275708075 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : secret "metrics-daemon-secret" not found Apr 16 14:53:13.984649 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:13.984618 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:53:13.987262 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:13.987244 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:13.997413 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:13.997390 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:14.007654 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:14.007632 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snssf\" (UniqueName: \"kubernetes.io/projected/d37fc185-ce8f-4c06-ace2-ca0a852977db-kube-api-access-snssf\") pod \"network-check-target-cp47z\" (UID: \"d37fc185-ce8f-4c06-ace2-ca0a852977db\") " pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:53:14.044012 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:14.043989 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dpmcp\"" Apr 16 14:53:14.051934 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:14.051915 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:53:14.176188 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:14.176157 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cp47z"] Apr 16 14:53:14.179188 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:53:14.179158 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd37fc185_ce8f_4c06_ace2_ca0a852977db.slice/crio-b89d468400d55584ccd79de2b4f54db9f952e2c0a18157875d8556c6a0d8217b WatchSource:0}: Error finding container b89d468400d55584ccd79de2b4f54db9f952e2c0a18157875d8556c6a0d8217b: Status 404 returned error can't find the container with id b89d468400d55584ccd79de2b4f54db9f952e2c0a18157875d8556c6a0d8217b Apr 16 14:53:14.458926 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:14.458857 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cp47z" event={"ID":"d37fc185-ce8f-4c06-ace2-ca0a852977db","Type":"ContainerStarted","Data":"b89d468400d55584ccd79de2b4f54db9f952e2c0a18157875d8556c6a0d8217b"} Apr 16 14:53:17.465570 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:17.465535 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cp47z" event={"ID":"d37fc185-ce8f-4c06-ace2-ca0a852977db","Type":"ContainerStarted","Data":"5b558a94825fb758a61eac2f90eb9f87943c6c0acb833dd6d451091d62148fa2"} Apr 16 14:53:17.465977 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:17.465638 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:53:17.484309 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:17.484255 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cp47z" 
podStartSLOduration=66.862112029 podStartE2EDuration="1m9.484239768s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:53:14.180978346 +0000 UTC m=+66.561975938" lastFinishedPulling="2026-04-16 14:53:16.80310608 +0000 UTC m=+69.184103677" observedRunningTime="2026-04-16 14:53:17.481502466 +0000 UTC m=+69.862500081" watchObservedRunningTime="2026-04-16 14:53:17.484239768 +0000 UTC m=+69.865237382" Apr 16 14:53:44.497353 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:44.497261 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:53:44.497353 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:44.497311 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:53:44.497353 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:44.497348 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497403 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497432 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497455 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497469 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497473 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.49745689 +0000 UTC m=+160.878454485 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497488 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.497482154 +0000 UTC m=+160.878479746 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found Apr 16 14:53:44.497793 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:53:44.497515 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.497500247 +0000 UTC m=+160.878497844 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found Apr 16 14:53:48.469971 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:53:48.469944 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cp47z" Apr 16 14:54:17.928264 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:17.928228 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:54:17.928754 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:17.928365 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:54:17.928754 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:17.928436 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs 
podName:ed427102-c549-468d-8146-32fba6da0a45 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:19.928419104 +0000 UTC m=+252.309416696 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs") pod "network-metrics-daemon-bppkn" (UID: "ed427102-c549-468d-8146-32fba6da0a45") : secret "metrics-daemon-secret" not found Apr 16 14:54:32.702547 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.702510 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h"] Apr 16 14:54:32.705169 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.705154 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h" Apr 16 14:54:32.707878 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.707858 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:54:32.708914 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.708890 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:54:32.709054 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.709021 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"] Apr 16 14:54:32.709174 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.709135 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-tqv2m\"" Apr 16 14:54:32.711560 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.711543 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console-operator/console-operator-d87b8d5fc-sx65q"] Apr 16 14:54:32.714858 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.714840 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-55d8df8b-t92xc"] Apr 16 14:54:32.714958 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.714892 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.715018 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.715002 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:32.717739 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.717721 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h"] Apr 16 14:54:32.717822 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.717810 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:32.718358 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718329 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 14:54:32.718453 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718435 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:54:32.718595 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718578 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-2bt2c\"" Apr 16 14:54:32.718691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718582 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:54:32.718691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718647 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:54:32.718691 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718678 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 14:54:32.718831 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718706 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:54:32.718979 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.718962 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:54:32.719074 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.719056 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 14:54:32.719140 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.719073 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6wd8w\"" Apr 16 14:54:32.720857 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.720840 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 14:54:32.721050 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.721035 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:54:32.721273 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.721259 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9fknh\"" Apr 16 14:54:32.721556 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.721539 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:54:32.721650 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.721561 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 14:54:32.721650 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.721584 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 14:54:32.721650 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.721539 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 14:54:32.724130 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.724110 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"] Apr 16 
14:54:32.724947 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.724926 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.725032 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.724984 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hjww\" (UniqueName: \"kubernetes.io/projected/db20efb8-03e9-4adc-82bf-e69768c8c347-kube-api-access-2hjww\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.725094 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.725079 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/db20efb8-03e9-4adc-82bf-e69768c8c347-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.725150 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.725136 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt8b8\" (UniqueName: \"kubernetes.io/projected/4b6350c7-96e4-468b-b761-620bbf50fa63-kube-api-access-mt8b8\") pod \"volume-data-source-validator-7d955d5dd4-xlt7h\" (UID: \"4b6350c7-96e4-468b-b761-620bbf50fa63\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h" Apr 16 14:54:32.725624 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:54:32.725591 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 14:54:32.729792 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.729776 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-sx65q"] Apr 16 14:54:32.730834 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.730817 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55d8df8b-t92xc"] Apr 16 14:54:32.826329 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826296 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt8b8\" (UniqueName: \"kubernetes.io/projected/4b6350c7-96e4-468b-b761-620bbf50fa63-kube-api-access-mt8b8\") pod \"volume-data-source-validator-7d955d5dd4-xlt7h\" (UID: \"4b6350c7-96e4-468b-b761-620bbf50fa63\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h" Apr 16 14:54:32.826329 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826332 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-config\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:32.826508 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826371 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dftfl\" (UniqueName: \"kubernetes.io/projected/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-kube-api-access-dftfl\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:32.826508 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:54:32.826430 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-trusted-ca\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:32.826508 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826480 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-stats-auth\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:32.826634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826506 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hjww\" (UniqueName: \"kubernetes.io/projected/db20efb8-03e9-4adc-82bf-e69768c8c347-kube-api-access-2hjww\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.826634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826553 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-serving-cert\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:32.826634 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826599 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-default-certificate\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:32.826740 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826669 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.826740 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826707 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:32.826740 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826735 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:32.826840 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:32.826753 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:32.826840 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826761 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xt4xs\" (UniqueName: \"kubernetes.io/projected/856eaa92-f51a-4a81-8e75-e2010da158d8-kube-api-access-xt4xs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:32.826840 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.826788 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/db20efb8-03e9-4adc-82bf-e69768c8c347-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:32.826840 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:32.826810 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls podName:db20efb8-03e9-4adc-82bf-e69768c8c347 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:33.326796195 +0000 UTC m=+145.707793790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jvldv" (UID: "db20efb8-03e9-4adc-82bf-e69768c8c347") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:32.827325 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.827307 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/db20efb8-03e9-4adc-82bf-e69768c8c347-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"
Apr 16 14:54:32.838052 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.838029 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt8b8\" (UniqueName: \"kubernetes.io/projected/4b6350c7-96e4-468b-b761-620bbf50fa63-kube-api-access-mt8b8\") pod \"volume-data-source-validator-7d955d5dd4-xlt7h\" (UID: \"4b6350c7-96e4-468b-b761-620bbf50fa63\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h"
Apr 16 14:54:32.838169 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.838036 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hjww\" (UniqueName: \"kubernetes.io/projected/db20efb8-03e9-4adc-82bf-e69768c8c347-kube-api-access-2hjww\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"
Apr 16 14:54:32.927983 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.927959 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-default-certificate\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.928083 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928001 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.928083 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928020 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.928083 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928047 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt4xs\" (UniqueName: \"kubernetes.io/projected/856eaa92-f51a-4a81-8e75-e2010da158d8-kube-api-access-xt4xs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.928083 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928080 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-config\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.928221 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928103 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dftfl\" (UniqueName: \"kubernetes.io/projected/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-kube-api-access-dftfl\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.928221 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928135 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-trusted-ca\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.928221 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928163 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-stats-auth\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.928221 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:32.928184 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:32.928221 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928204 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-serving-cert\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.928461 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:32.928246 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:33.42822775 +0000 UTC m=+145.809225364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : secret "router-metrics-certs-default" not found
Apr 16 14:54:32.928461 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:32.928276 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:33.428265852 +0000 UTC m=+145.809263447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:32.928841 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.928815 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-config\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.929458 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.929323 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-trusted-ca\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.930525 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.930502 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-default-certificate\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.930626 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.930585 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-stats-auth\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.930673 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.930603 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-serving-cert\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:32.937260 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.937239 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt4xs\" (UniqueName: \"kubernetes.io/projected/856eaa92-f51a-4a81-8e75-e2010da158d8-kube-api-access-xt4xs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:32.937578 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:32.937562 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dftfl\" (UniqueName: \"kubernetes.io/projected/e23b4d8a-18ac-44f6-bf57-10e1ac5b3247-kube-api-access-dftfl\") pod \"console-operator-d87b8d5fc-sx65q\" (UID: \"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247\") " pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:33.014623 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.014559 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h"
Apr 16 14:54:33.034521 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.034499 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:54:33.136324 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.136295 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h"]
Apr 16 14:54:33.139650 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:54:33.139600 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b6350c7_96e4_468b_b761_620bbf50fa63.slice/crio-d572f21bf95ac6846353504ef16fcb569d10bc4b289ee97befc4b86c048738fb WatchSource:0}: Error finding container d572f21bf95ac6846353504ef16fcb569d10bc4b289ee97befc4b86c048738fb: Status 404 returned error can't find the container with id d572f21bf95ac6846353504ef16fcb569d10bc4b289ee97befc4b86c048738fb
Apr 16 14:54:33.152046 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.152022 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-sx65q"]
Apr 16 14:54:33.155269 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:54:33.155247 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23b4d8a_18ac_44f6_bf57_10e1ac5b3247.slice/crio-f31e61df50da0e0189a60bc41331011a3241f548855bc96a560e7a56c0f0f3e3 WatchSource:0}: Error finding container f31e61df50da0e0189a60bc41331011a3241f548855bc96a560e7a56c0f0f3e3: Status 404 returned error can't find the container with id f31e61df50da0e0189a60bc41331011a3241f548855bc96a560e7a56c0f0f3e3
Apr 16 14:54:33.330638 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.330597 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"
Apr 16 14:54:33.330755 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:33.330737 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:33.330807 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:33.330798 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls podName:db20efb8-03e9-4adc-82bf-e69768c8c347 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:34.330783193 +0000 UTC m=+146.711780785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jvldv" (UID: "db20efb8-03e9-4adc-82bf-e69768c8c347") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:33.431152 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.431105 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:33.431152 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.431154 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:33.431393 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:33.431264 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:33.431393 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:33.431294 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:34.431253293 +0000 UTC m=+146.812250909 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:33.431393 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:33.431322 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:34.431313073 +0000 UTC m=+146.812310669 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : secret "router-metrics-certs-default" not found
Apr 16 14:54:33.605647 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.605553 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h" event={"ID":"4b6350c7-96e4-468b-b761-620bbf50fa63","Type":"ContainerStarted","Data":"d572f21bf95ac6846353504ef16fcb569d10bc4b289ee97befc4b86c048738fb"}
Apr 16 14:54:33.606500 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:33.606476 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" event={"ID":"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247","Type":"ContainerStarted","Data":"f31e61df50da0e0189a60bc41331011a3241f548855bc96a560e7a56c0f0f3e3"}
Apr 16 14:54:34.338870 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.338829 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"
Apr 16 14:54:34.339341 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:34.338984 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:34.339341 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:34.339073 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls podName:db20efb8-03e9-4adc-82bf-e69768c8c347 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:36.339053061 +0000 UTC m=+148.720050677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jvldv" (UID: "db20efb8-03e9-4adc-82bf-e69768c8c347") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:34.406167 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.406124 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"]
Apr 16 14:54:34.409422 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.409400 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.412579 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.412261 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 14:54:34.412579 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.412502 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-n4f2j\""
Apr 16 14:54:34.412861 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.412835 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 14:54:34.413651 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.413432 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:54:34.413651 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.413475 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 14:54:34.419718 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.419351 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"]
Apr 16 14:54:34.440176 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.440151 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmpqj\" (UniqueName: \"kubernetes.io/projected/46f7af62-73ce-4f89-a210-d2280368ebfc-kube-api-access-dmpqj\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.440269 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.440201 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:34.440323 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.440260 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:34.440323 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:34.440300 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:36.440281303 +0000 UTC m=+148.821278895 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:34.440423 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:34.440357 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:34.440423 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.440397 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f7af62-73ce-4f89-a210-d2280368ebfc-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.440423 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:34.440404 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:36.440388746 +0000 UTC m=+148.821386345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : secret "router-metrics-certs-default" not found
Apr 16 14:54:34.440558 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.440441 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46f7af62-73ce-4f89-a210-d2280368ebfc-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.541208 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.541173 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f7af62-73ce-4f89-a210-d2280368ebfc-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.541393 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.541315 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46f7af62-73ce-4f89-a210-d2280368ebfc-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.541456 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.541408 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmpqj\" (UniqueName: \"kubernetes.io/projected/46f7af62-73ce-4f89-a210-d2280368ebfc-kube-api-access-dmpqj\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.541787 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.541756 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46f7af62-73ce-4f89-a210-d2280368ebfc-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.544113 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.544083 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46f7af62-73ce-4f89-a210-d2280368ebfc-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.549493 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.549449 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmpqj\" (UniqueName: \"kubernetes.io/projected/46f7af62-73ce-4f89-a210-d2280368ebfc-kube-api-access-dmpqj\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth4n\" (UID: \"46f7af62-73ce-4f89-a210-d2280368ebfc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:34.722563 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:34.722483 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"
Apr 16 14:54:35.223281 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.223254 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n"]
Apr 16 14:54:35.226322 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:54:35.226288 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f7af62_73ce_4f89_a210_d2280368ebfc.slice/crio-f5699f30a3ffba2c86d2b9d1cd9375557e45ada4128997587efa0b59fcdb7491 WatchSource:0}: Error finding container f5699f30a3ffba2c86d2b9d1cd9375557e45ada4128997587efa0b59fcdb7491: Status 404 returned error can't find the container with id f5699f30a3ffba2c86d2b9d1cd9375557e45ada4128997587efa0b59fcdb7491
Apr 16 14:54:35.611577 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.611540 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n" event={"ID":"46f7af62-73ce-4f89-a210-d2280368ebfc","Type":"ContainerStarted","Data":"f5699f30a3ffba2c86d2b9d1cd9375557e45ada4128997587efa0b59fcdb7491"}
Apr 16 14:54:35.612773 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.612750 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h" event={"ID":"4b6350c7-96e4-468b-b761-620bbf50fa63","Type":"ContainerStarted","Data":"777f345a634a7b9427e9666efb2c3284e86abef00a40c6ffee9e71ba59fa3c01"}
Apr 16 14:54:35.613995 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.613978 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/0.log"
Apr 16 14:54:35.614073 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.614012 2561 generic.go:358] "Generic (PLEG): container finished" podID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" containerID="21f826f26c138969a1f669dade87e23136a763cc4d3e2d8cd6d0c9f90548bba5" exitCode=255
Apr 16 14:54:35.614073 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.614034 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" event={"ID":"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247","Type":"ContainerDied","Data":"21f826f26c138969a1f669dade87e23136a763cc4d3e2d8cd6d0c9f90548bba5"}
Apr 16 14:54:35.614265 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.614251 2561 scope.go:117] "RemoveContainer" containerID="21f826f26c138969a1f669dade87e23136a763cc4d3e2d8cd6d0c9f90548bba5"
Apr 16 14:54:35.626540 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:35.626504 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-xlt7h" podStartSLOduration=1.657805848 podStartE2EDuration="3.626489841s" podCreationTimestamp="2026-04-16 14:54:32 +0000 UTC" firstStartedPulling="2026-04-16 14:54:33.141532319 +0000 UTC m=+145.522529911" lastFinishedPulling="2026-04-16 14:54:35.110216297 +0000 UTC m=+147.491213904" observedRunningTime="2026-04-16 14:54:35.625953889 +0000 UTC m=+148.006951504" watchObservedRunningTime="2026-04-16 14:54:35.626489841 +0000 UTC m=+148.007487456"
Apr 16 14:54:36.356153 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.356118 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"
Apr 16 14:54:36.356313 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:36.356254 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:36.356354 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:36.356318 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls podName:db20efb8-03e9-4adc-82bf-e69768c8c347 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.356302383 +0000 UTC m=+152.737299975 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jvldv" (UID: "db20efb8-03e9-4adc-82bf-e69768c8c347") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:54:36.457145 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.457060 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:36.457145 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.457100 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:54:36.457340 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:36.457193 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:36.457340 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:36.457212 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.457190983 +0000 UTC m=+152.838188582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:36.457340 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:36.457240 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.457225939 +0000 UTC m=+152.838223532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : secret "router-metrics-certs-default" not found
Apr 16 14:54:36.617260 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.617232 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/1.log"
Apr 16 14:54:36.617694 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.617677 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/0.log"
Apr 16 14:54:36.617761 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.617722 2561 generic.go:358] "Generic (PLEG): container finished" podID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" containerID="1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f" exitCode=255
Apr 16 14:54:36.617834 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.617811 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" event={"ID":"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247","Type":"ContainerDied","Data":"1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f"}
Apr 16 14:54:36.617883 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.617857 2561 scope.go:117] "RemoveContainer" containerID="21f826f26c138969a1f669dade87e23136a763cc4d3e2d8cd6d0c9f90548bba5"
Apr 16 14:54:36.618101 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:36.618071 2561 scope.go:117] "RemoveContainer" containerID="1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f"
Apr 16 14:54:36.618328 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:36.618306 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-sx65q_openshift-console-operator(e23b4d8a-18ac-44f6-bf57-10e1ac5b3247)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podUID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247"
Apr 16 14:54:37.433543 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.433510 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"]
Apr 16 14:54:37.436376 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.436359 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"
Apr 16 14:54:37.439053 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.439032 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 14:54:37.439256 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.439228 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-phchz\""
Apr 16 14:54:37.439370 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.439228 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 14:54:37.439370 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.439287 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:54:37.439370 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.439296 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 14:54:37.443148 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.443120 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"]
Apr 16 14:54:37.466077 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.466055 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ps99\" (UniqueName: \"kubernetes.io/projected/267060b3-88e5-4515-b499-8a01192a414b-kube-api-access-6ps99\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"
Apr 16 14:54:37.466192 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.466096 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267060b3-88e5-4515-b499-8a01192a414b-config\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"
Apr 16 14:54:37.466192 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.466170 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/267060b3-88e5-4515-b499-8a01192a414b-serving-cert\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"
Apr 16 14:54:37.567597 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.567566 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ps99\" (UniqueName: \"kubernetes.io/projected/267060b3-88e5-4515-b499-8a01192a414b-kube-api-access-6ps99\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"
Apr 16 14:54:37.567744 ip-10-0-142-86
kubenswrapper[2561]: I0416 14:54:37.567631 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267060b3-88e5-4515-b499-8a01192a414b-config\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" Apr 16 14:54:37.567804 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.567746 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/267060b3-88e5-4515-b499-8a01192a414b-serving-cert\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" Apr 16 14:54:37.568191 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.568173 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/267060b3-88e5-4515-b499-8a01192a414b-config\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" Apr 16 14:54:37.569860 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.569833 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/267060b3-88e5-4515-b499-8a01192a414b-serving-cert\") pod \"service-ca-operator-69965bb79d-2lqmg\" (UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" Apr 16 14:54:37.576633 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.576584 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ps99\" (UniqueName: \"kubernetes.io/projected/267060b3-88e5-4515-b499-8a01192a414b-kube-api-access-6ps99\") pod \"service-ca-operator-69965bb79d-2lqmg\" 
(UID: \"267060b3-88e5-4515-b499-8a01192a414b\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" Apr 16 14:54:37.620914 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.620884 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n" event={"ID":"46f7af62-73ce-4f89-a210-d2280368ebfc","Type":"ContainerStarted","Data":"5a260ac684a3d0715d62b739a8451486e4cf77058963d4c67334b2198da70174"} Apr 16 14:54:37.622218 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.622201 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/1.log" Apr 16 14:54:37.622519 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.622505 2561 scope.go:117] "RemoveContainer" containerID="1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f" Apr 16 14:54:37.622720 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:37.622702 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-sx65q_openshift-console-operator(e23b4d8a-18ac-44f6-bf57-10e1ac5b3247)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podUID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" Apr 16 14:54:37.638957 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.638920 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n" podStartSLOduration=1.872729245 podStartE2EDuration="3.638909211s" podCreationTimestamp="2026-04-16 14:54:34 +0000 UTC" firstStartedPulling="2026-04-16 14:54:35.228456545 +0000 UTC m=+147.609454137" lastFinishedPulling="2026-04-16 14:54:36.994636506 
+0000 UTC m=+149.375634103" observedRunningTime="2026-04-16 14:54:37.638857405 +0000 UTC m=+150.019855020" watchObservedRunningTime="2026-04-16 14:54:37.638909211 +0000 UTC m=+150.019906852" Apr 16 14:54:37.745523 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.745449 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" Apr 16 14:54:37.859939 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.859917 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg"] Apr 16 14:54:37.861865 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:54:37.861838 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267060b3_88e5_4515_b499_8a01192a414b.slice/crio-fe72bb4e58fa53b82d7d11b34a5c04b344494a8f85b01e460419e8f5857b2eab WatchSource:0}: Error finding container fe72bb4e58fa53b82d7d11b34a5c04b344494a8f85b01e460419e8f5857b2eab: Status 404 returned error can't find the container with id fe72bb4e58fa53b82d7d11b34a5c04b344494a8f85b01e460419e8f5857b2eab Apr 16 14:54:37.989217 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:37.989197 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h54t4_72695b7c-a9cd-4e46-80e9-10740ab7de94/dns-node-resolver/0.log" Apr 16 14:54:38.625381 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:38.625340 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" event={"ID":"267060b3-88e5-4515-b499-8a01192a414b","Type":"ContainerStarted","Data":"fe72bb4e58fa53b82d7d11b34a5c04b344494a8f85b01e460419e8f5857b2eab"} Apr 16 14:54:38.991247 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:38.991183 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-tkx2t_f593ceff-50dd-4533-bbeb-0dcc375c12b9/node-ca/0.log" Apr 16 14:54:40.391439 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.391406 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:40.391856 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.391518 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:40.391856 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.391568 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls podName:db20efb8-03e9-4adc-82bf-e69768c8c347 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.391554282 +0000 UTC m=+160.772551874 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jvldv" (UID: "db20efb8-03e9-4adc-82bf-e69768c8c347") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:40.492521 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.492490 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:40.492662 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.492530 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:40.492662 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.492658 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.492637168 +0000 UTC m=+160.873634763 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : configmap references non-existent config key: service-ca.crt Apr 16 14:54:40.492779 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.492663 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:54:40.492779 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.492716 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.492699335 +0000 UTC m=+160.873696947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : secret "router-metrics-certs-default" not found Apr 16 14:54:40.607236 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.607208 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8"] Apr 16 14:54:40.610096 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.610081 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:40.613253 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.612852 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 14:54:40.613253 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.612868 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 14:54:40.613253 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.613118 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-r94mf\"" Apr 16 14:54:40.619198 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.619170 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8"] Apr 16 14:54:40.630878 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.630857 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" event={"ID":"267060b3-88e5-4515-b499-8a01192a414b","Type":"ContainerStarted","Data":"efd5f26a39c7b037483e2245137b62e725ebbc03cd311428ba457dcb79554f9e"} Apr 16 14:54:40.647480 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.647405 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" podStartSLOduration=1.632357856 podStartE2EDuration="3.647391242s" podCreationTimestamp="2026-04-16 14:54:37 +0000 UTC" firstStartedPulling="2026-04-16 14:54:37.864269344 +0000 UTC m=+150.245266939" lastFinishedPulling="2026-04-16 14:54:39.879302734 +0000 UTC m=+152.260300325" observedRunningTime="2026-04-16 14:54:40.646263289 +0000 UTC m=+153.027260903" watchObservedRunningTime="2026-04-16 14:54:40.647391242 +0000 UTC 
m=+153.028388858" Apr 16 14:54:40.693809 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.693778 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dc9b1052-22c3-43fc-82e3-c5e203f94377-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:40.693922 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.693908 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:40.794853 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.794823 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:40.794964 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.794913 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dc9b1052-22c3-43fc-82e3-c5e203f94377-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:40.794964 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.794943 2561 
secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:54:40.795074 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:40.794998 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert podName:dc9b1052-22c3-43fc-82e3-c5e203f94377 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:41.294984303 +0000 UTC m=+153.675981895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7htx8" (UID: "dc9b1052-22c3-43fc-82e3-c5e203f94377") : secret "networking-console-plugin-cert" not found Apr 16 14:54:40.795501 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:40.795481 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dc9b1052-22c3-43fc-82e3-c5e203f94377-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:41.300534 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:41.300502 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:41.300731 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:41.300665 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 16 14:54:41.300790 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:41.300740 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert podName:dc9b1052-22c3-43fc-82e3-c5e203f94377 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:42.30072038 +0000 UTC m=+154.681717978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7htx8" (UID: "dc9b1052-22c3-43fc-82e3-c5e203f94377") : secret "networking-console-plugin-cert" not found Apr 16 14:54:42.309650 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:42.309619 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:42.310003 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:42.309765 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:54:42.310003 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:42.309835 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert podName:dc9b1052-22c3-43fc-82e3-c5e203f94377 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:44.309820103 +0000 UTC m=+156.690817700 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7htx8" (UID: "dc9b1052-22c3-43fc-82e3-c5e203f94377") : secret "networking-console-plugin-cert" not found Apr 16 14:54:43.035601 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.035565 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:43.035601 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.035599 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:54:43.035964 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.035947 2561 scope.go:117] "RemoveContainer" containerID="1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f" Apr 16 14:54:43.036116 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:43.036099 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-sx65q_openshift-console-operator(e23b4d8a-18ac-44f6-bf57-10e1ac5b3247)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podUID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" Apr 16 14:54:43.593309 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:43.593271 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-68684b6fc-df98m" podUID="d105a85e-d684-4862-877b-ab11c5b1ca26" Apr 16 14:54:43.607430 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:43.607401 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5lzf7" podUID="18926de0-0561-424c-845b-6ea1059c821a" Apr 16 14:54:43.625661 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:43.625637 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vlvr9" podUID="d01b24a9-f9f3-4d8c-830c-38ff2cc50292" Apr 16 14:54:43.638443 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.638419 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:54:43.638548 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.638459 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:54:43.638645 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.638629 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5lzf7" Apr 16 14:54:43.940657 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.940585 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-cl6mc"] Apr 16 14:54:43.943686 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.943672 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:43.946131 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.946101 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:54:43.946370 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.946354 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:54:43.947150 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.947130 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:54:43.947248 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.947131 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:54:43.947248 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.947137 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-q7znb\"" Apr 16 14:54:43.950692 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:43.950673 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-cl6mc"] Apr 16 14:54:44.021917 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.021894 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjt5w\" (UniqueName: \"kubernetes.io/projected/32d22cbb-616a-452b-94d4-3b9045fe3041-kube-api-access-qjt5w\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.022021 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.022004 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/32d22cbb-616a-452b-94d4-3b9045fe3041-signing-key\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.022063 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.022030 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d22cbb-616a-452b-94d4-3b9045fe3041-signing-cabundle\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.122374 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.122348 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32d22cbb-616a-452b-94d4-3b9045fe3041-signing-key\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.122487 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.122376 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d22cbb-616a-452b-94d4-3b9045fe3041-signing-cabundle\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.122559 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.122541 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjt5w\" (UniqueName: \"kubernetes.io/projected/32d22cbb-616a-452b-94d4-3b9045fe3041-kube-api-access-qjt5w\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.122958 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.122940 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32d22cbb-616a-452b-94d4-3b9045fe3041-signing-cabundle\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.124618 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.124591 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32d22cbb-616a-452b-94d4-3b9045fe3041-signing-key\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.130412 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.130392 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjt5w\" (UniqueName: \"kubernetes.io/projected/32d22cbb-616a-452b-94d4-3b9045fe3041-kube-api-access-qjt5w\") pod \"service-ca-bfc587fb7-cl6mc\" (UID: \"32d22cbb-616a-452b-94d4-3b9045fe3041\") " pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.252786 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.252718 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" Apr 16 14:54:44.324294 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.324264 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:44.324434 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:44.324417 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:54:44.324495 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:44.324485 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert podName:dc9b1052-22c3-43fc-82e3-c5e203f94377 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:48.324467649 +0000 UTC m=+160.705465260 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7htx8" (UID: "dc9b1052-22c3-43fc-82e3-c5e203f94377") : secret "networking-console-plugin-cert" not found Apr 16 14:54:44.361733 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.361708 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-cl6mc"] Apr 16 14:54:44.364962 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:54:44.364938 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32d22cbb_616a_452b_94d4_3b9045fe3041.slice/crio-9f2976b472592cda4f89eb4bf2f12e90f1008b5e3d78d64c32173c0d4586657e WatchSource:0}: Error finding container 9f2976b472592cda4f89eb4bf2f12e90f1008b5e3d78d64c32173c0d4586657e: Status 404 returned error can't find the container with id 9f2976b472592cda4f89eb4bf2f12e90f1008b5e3d78d64c32173c0d4586657e Apr 16 14:54:44.642249 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.642216 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" event={"ID":"32d22cbb-616a-452b-94d4-3b9045fe3041","Type":"ContainerStarted","Data":"44e7de6556bc36b3d0737822141c324c5b971045d6b59fd65d434cd9b064228f"} Apr 16 14:54:44.642249 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.642250 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" event={"ID":"32d22cbb-616a-452b-94d4-3b9045fe3041","Type":"ContainerStarted","Data":"9f2976b472592cda4f89eb4bf2f12e90f1008b5e3d78d64c32173c0d4586657e"} Apr 16 14:54:44.660407 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:44.660366 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-cl6mc" 
podStartSLOduration=1.6603542340000002 podStartE2EDuration="1.660354234s" podCreationTimestamp="2026-04-16 14:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:54:44.659437949 +0000 UTC m=+157.040435568" watchObservedRunningTime="2026-04-16 14:54:44.660354234 +0000 UTC m=+157.041351847" Apr 16 14:54:45.236459 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:45.236418 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bppkn" podUID="ed427102-c549-468d-8146-32fba6da0a45" Apr 16 14:54:48.357942 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.357909 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:48.358413 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.358081 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:54:48.358413 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.358167 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert podName:dc9b1052-22c3-43fc-82e3-c5e203f94377 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:56.358138207 +0000 UTC m=+168.739135802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-7htx8" (UID: "dc9b1052-22c3-43fc-82e3-c5e203f94377") : secret "networking-console-plugin-cert" not found Apr 16 14:54:48.458396 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.458366 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:54:48.458566 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.458528 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:48.458698 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.458600 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls podName:db20efb8-03e9-4adc-82bf-e69768c8c347 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.458579569 +0000 UTC m=+176.839577162 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jvldv" (UID: "db20efb8-03e9-4adc-82bf-e69768c8c347") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:48.559102 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.559072 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:54:48.559239 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.559135 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:54:48.559239 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.559170 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") pod \"image-registry-68684b6fc-df98m\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:54:48.559239 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559211 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.559242 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559270 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559280 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls podName:18926de0-0561-424c-845b-6ea1059c821a nodeName:}" failed. No retries permitted until 2026-04-16 14:56:50.559261399 +0000 UTC m=+282.940259015 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls") pod "dns-default-5lzf7" (UID: "18926de0-0561-424c-845b-6ea1059c821a") : secret "dns-default-metrics-tls" not found Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:48.559306 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559314 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert podName:d01b24a9-f9f3-4d8c-830c-38ff2cc50292 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:50.559300376 +0000 UTC m=+282.940297972 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert") pod "ingress-canary-vlvr9" (UID: "d01b24a9-f9f3-4d8c-830c-38ff2cc50292") : secret "canary-serving-cert" not found Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559343 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.559328447 +0000 UTC m=+176.940326052 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : configmap references non-existent config key: service-ca.crt Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559363 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:54:48.559379 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559372 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:48.559684 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559388 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs podName:856eaa92-f51a-4a81-8e75-e2010da158d8 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.559380315 +0000 UTC m=+176.940377912 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs") pod "router-default-55d8df8b-t92xc" (UID: "856eaa92-f51a-4a81-8e75-e2010da158d8") : secret "router-metrics-certs-default" not found Apr 16 14:54:48.559684 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559392 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68684b6fc-df98m: secret "image-registry-tls" not found Apr 16 14:54:48.559684 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:48.559442 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls podName:d105a85e-d684-4862-877b-ab11c5b1ca26 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:50.55942776 +0000 UTC m=+282.940425365 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls") pod "image-registry-68684b6fc-df98m" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26") : secret "image-registry-tls" not found Apr 16 14:54:56.420272 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:56.420233 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:56.422660 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:56.422632 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dc9b1052-22c3-43fc-82e3-c5e203f94377-networking-console-plugin-cert\") pod 
\"networking-console-plugin-5cb6cf4cb4-7htx8\" (UID: \"dc9b1052-22c3-43fc-82e3-c5e203f94377\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:56.519442 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:56.519414 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" Apr 16 14:54:56.628894 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:56.628865 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8"] Apr 16 14:54:56.631671 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:54:56.631644 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc9b1052_22c3_43fc_82e3_c5e203f94377.slice/crio-9777c5ccb9b67dfb7f6f793da406fe3db48ac9a99e9facc6c23b41abdd5cdf74 WatchSource:0}: Error finding container 9777c5ccb9b67dfb7f6f793da406fe3db48ac9a99e9facc6c23b41abdd5cdf74: Status 404 returned error can't find the container with id 9777c5ccb9b67dfb7f6f793da406fe3db48ac9a99e9facc6c23b41abdd5cdf74 Apr 16 14:54:56.671958 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:56.671898 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" event={"ID":"dc9b1052-22c3-43fc-82e3-c5e203f94377","Type":"ContainerStarted","Data":"9777c5ccb9b67dfb7f6f793da406fe3db48ac9a99e9facc6c23b41abdd5cdf74"} Apr 16 14:54:57.221441 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.221411 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:54:57.221767 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.221748 2561 scope.go:117] "RemoveContainer" containerID="1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f" Apr 16 14:54:57.675325 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.675306 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 14:54:57.675709 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.675696 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/1.log" Apr 16 14:54:57.675755 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.675727 2561 generic.go:358] "Generic (PLEG): container finished" podID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" containerID="24c8ba653f740e9cd1f34787b6a62132d21b14bc60da2ce3b13f2e9734468619" exitCode=255 Apr 16 14:54:57.675788 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.675754 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" event={"ID":"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247","Type":"ContainerDied","Data":"24c8ba653f740e9cd1f34787b6a62132d21b14bc60da2ce3b13f2e9734468619"} Apr 16 14:54:57.675788 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.675780 2561 scope.go:117] "RemoveContainer" containerID="1c7df864ed164359da8cc304f3a716f9b7e39d299703864810431b1ce058a70f" Apr 16 14:54:57.676146 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:57.676120 2561 scope.go:117] "RemoveContainer" containerID="24c8ba653f740e9cd1f34787b6a62132d21b14bc60da2ce3b13f2e9734468619" Apr 16 14:54:57.676344 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:54:57.676324 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-sx65q_openshift-console-operator(e23b4d8a-18ac-44f6-bf57-10e1ac5b3247)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podUID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" Apr 16 14:54:58.679003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:58.678974 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 14:54:58.680193 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:58.680168 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" event={"ID":"dc9b1052-22c3-43fc-82e3-c5e203f94377","Type":"ContainerStarted","Data":"53c67a541400483135db2a03481fd55cf5eafd6d69fe78db024562bbc255cbc8"} Apr 16 14:54:58.696938 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:54:58.696900 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-7htx8" podStartSLOduration=17.689758521999998 podStartE2EDuration="18.696887764s" podCreationTimestamp="2026-04-16 14:54:40 +0000 UTC" firstStartedPulling="2026-04-16 14:54:56.633504638 +0000 UTC m=+169.014502230" lastFinishedPulling="2026-04-16 14:54:57.640633877 +0000 UTC m=+170.021631472" observedRunningTime="2026-04-16 14:54:58.694873353 +0000 UTC m=+171.075870970" watchObservedRunningTime="2026-04-16 14:54:58.696887764 +0000 UTC m=+171.077885431" Apr 16 14:55:03.035480 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.035439 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:55:03.035480 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.035486 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" Apr 16 14:55:03.035933 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.035829 2561 scope.go:117] "RemoveContainer" containerID="24c8ba653f740e9cd1f34787b6a62132d21b14bc60da2ce3b13f2e9734468619" Apr 16 14:55:03.035997 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:55:03.035980 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-sx65q_openshift-console-operator(e23b4d8a-18ac-44f6-bf57-10e1ac5b3247)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podUID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" Apr 16 14:55:03.318318 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.318244 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4c465"] Apr 16 14:55:03.323099 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.323072 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.325984 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.325959 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:03.326132 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.325993 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:03.326132 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.325960 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:03.326249 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.326200 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:03.327115 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.327095 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-clwz5\"" Apr 16 14:55:03.338199 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.338181 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4c465"] Apr 16 14:55:03.478300 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.478268 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-crio-socket\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.478300 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.478307 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-45bsv\" (UniqueName: \"kubernetes.io/projected/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-kube-api-access-45bsv\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.478557 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.478384 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.478557 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.478416 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-data-volume\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.478557 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.478525 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.579908 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.579814 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-data-volume\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " 
pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.579908 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.579903 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.580096 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.579932 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-crio-socket\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.580096 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.579949 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45bsv\" (UniqueName: \"kubernetes.io/projected/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-kube-api-access-45bsv\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.580096 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.580036 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-crio-socket\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.580212 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.580167 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-data-volume\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.580212 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.580198 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.580627 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.580583 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.582291 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.582274 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.592918 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.592896 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bsv\" (UniqueName: \"kubernetes.io/projected/88cecfa9-1dbd-4fa5-a33b-463543fb9b31-kube-api-access-45bsv\") pod \"insights-runtime-extractor-4c465\" (UID: \"88cecfa9-1dbd-4fa5-a33b-463543fb9b31\") " pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.632274 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:55:03.632249 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4c465" Apr 16 14:55:03.747462 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:03.747428 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4c465"] Apr 16 14:55:03.750944 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:03.750918 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88cecfa9_1dbd_4fa5_a33b_463543fb9b31.slice/crio-9c6286b098ef04e2eca7b8104b750dad39f41ed2570f06686a5ecfee2bee416d WatchSource:0}: Error finding container 9c6286b098ef04e2eca7b8104b750dad39f41ed2570f06686a5ecfee2bee416d: Status 404 returned error can't find the container with id 9c6286b098ef04e2eca7b8104b750dad39f41ed2570f06686a5ecfee2bee416d Apr 16 14:55:04.487899 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.487872 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:55:04.490111 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.490087 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/db20efb8-03e9-4adc-82bf-e69768c8c347-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jvldv\" (UID: \"db20efb8-03e9-4adc-82bf-e69768c8c347\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:55:04.525773 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.525752 2561 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" Apr 16 14:55:04.588438 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.588405 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:55:04.588585 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.588453 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:55:04.589021 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.588996 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/856eaa92-f51a-4a81-8e75-e2010da158d8-service-ca-bundle\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:55:04.591206 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.591183 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/856eaa92-f51a-4a81-8e75-e2010da158d8-metrics-certs\") pod \"router-default-55d8df8b-t92xc\" (UID: \"856eaa92-f51a-4a81-8e75-e2010da158d8\") " pod="openshift-ingress/router-default-55d8df8b-t92xc" Apr 16 14:55:04.640369 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.640338 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv"] Apr 16 
14:55:04.643144 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:04.643117 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb20efb8_03e9_4adc_82bf_e69768c8c347.slice/crio-b2b18a61f86e8049d11a3b59e799610b64a7dc7c98c636052a837d37fd3f6426 WatchSource:0}: Error finding container b2b18a61f86e8049d11a3b59e799610b64a7dc7c98c636052a837d37fd3f6426: Status 404 returned error can't find the container with id b2b18a61f86e8049d11a3b59e799610b64a7dc7c98c636052a837d37fd3f6426
Apr 16 14:55:04.697443 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.697406 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" event={"ID":"db20efb8-03e9-4adc-82bf-e69768c8c347","Type":"ContainerStarted","Data":"b2b18a61f86e8049d11a3b59e799610b64a7dc7c98c636052a837d37fd3f6426"}
Apr 16 14:55:04.698844 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.698825 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4c465" event={"ID":"88cecfa9-1dbd-4fa5-a33b-463543fb9b31","Type":"ContainerStarted","Data":"fc7c51a6f1ffa1f4f52464f4e5080ac6e00c4861ccfd26b7b868d343d52b500e"}
Apr 16 14:55:04.698929 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.698850 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4c465" event={"ID":"88cecfa9-1dbd-4fa5-a33b-463543fb9b31","Type":"ContainerStarted","Data":"0c9a09121323617f89aa02d2769091625ffc4656099f761f53057ceca5a33123"}
Apr 16 14:55:04.698929 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.698860 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4c465" event={"ID":"88cecfa9-1dbd-4fa5-a33b-463543fb9b31","Type":"ContainerStarted","Data":"9c6286b098ef04e2eca7b8104b750dad39f41ed2570f06686a5ecfee2bee416d"}
Apr 16 14:55:04.842275 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.842250 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:55:04.956327 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:04.956304 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55d8df8b-t92xc"]
Apr 16 14:55:04.958113 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:04.958089 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod856eaa92_f51a_4a81_8e75_e2010da158d8.slice/crio-b6a9c35d2bc4ff7e2f9790900b8fde5bbffebec4214ab38665f7f45bce79a913 WatchSource:0}: Error finding container b6a9c35d2bc4ff7e2f9790900b8fde5bbffebec4214ab38665f7f45bce79a913: Status 404 returned error can't find the container with id b6a9c35d2bc4ff7e2f9790900b8fde5bbffebec4214ab38665f7f45bce79a913
Apr 16 14:55:05.702901 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:05.702861 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55d8df8b-t92xc" event={"ID":"856eaa92-f51a-4a81-8e75-e2010da158d8","Type":"ContainerStarted","Data":"828fda2711721f28b89d35b806b2ffa6602cab9128c43846f3a81bfdd9289231"}
Apr 16 14:55:05.702901 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:05.702905 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55d8df8b-t92xc" event={"ID":"856eaa92-f51a-4a81-8e75-e2010da158d8","Type":"ContainerStarted","Data":"b6a9c35d2bc4ff7e2f9790900b8fde5bbffebec4214ab38665f7f45bce79a913"}
Apr 16 14:55:05.722102 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:05.722050 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-55d8df8b-t92xc" podStartSLOduration=33.72203206 podStartE2EDuration="33.72203206s" podCreationTimestamp="2026-04-16 14:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:05.720514781 +0000 UTC m=+178.101512395" watchObservedRunningTime="2026-04-16 14:55:05.72203206 +0000 UTC m=+178.103029675"
Apr 16 14:55:05.842789 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:05.842754 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:55:05.845864 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:05.845838 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:55:06.706688 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.706601 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" event={"ID":"db20efb8-03e9-4adc-82bf-e69768c8c347","Type":"ContainerStarted","Data":"652a2051eaa1d5cf39305ef68b40491977e62b83b408ff7cf50643dfac57ab96"}
Apr 16 14:55:06.708452 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.708420 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4c465" event={"ID":"88cecfa9-1dbd-4fa5-a33b-463543fb9b31","Type":"ContainerStarted","Data":"067681390feff85f48d5b5d73786e90751d153523fe2df46dd100767775cd35c"}
Apr 16 14:55:06.708703 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.708677 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:55:06.709779 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.709762 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-55d8df8b-t92xc"
Apr 16 14:55:06.774771 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.774727 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jvldv" podStartSLOduration=32.97772518 podStartE2EDuration="34.77471582s" podCreationTimestamp="2026-04-16 14:54:32 +0000 UTC" firstStartedPulling="2026-04-16 14:55:04.644922956 +0000 UTC m=+177.025920548" lastFinishedPulling="2026-04-16 14:55:06.441913582 +0000 UTC m=+178.822911188" observedRunningTime="2026-04-16 14:55:06.735621882 +0000 UTC m=+179.116619493" watchObservedRunningTime="2026-04-16 14:55:06.77471582 +0000 UTC m=+179.155713488"
Apr 16 14:55:06.796600 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.796554 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4c465" podStartSLOduration=1.186947329 podStartE2EDuration="3.796543054s" podCreationTimestamp="2026-04-16 14:55:03 +0000 UTC" firstStartedPulling="2026-04-16 14:55:03.829370826 +0000 UTC m=+176.210368419" lastFinishedPulling="2026-04-16 14:55:06.438966544 +0000 UTC m=+178.819964144" observedRunningTime="2026-04-16 14:55:06.794698602 +0000 UTC m=+179.175696213" watchObservedRunningTime="2026-04-16 14:55:06.796543054 +0000 UTC m=+179.177540669"
Apr 16 14:55:06.974864 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.974794 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"]
Apr 16 14:55:06.977704 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.977688 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:06.980319 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.980292 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 14:55:06.980425 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.980346 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mc85n\""
Apr 16 14:55:06.988234 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:06.988210 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"]
Apr 16 14:55:07.108825 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:07.108787 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/54b1b226-0457-4415-85be-c66a6e47c41b-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-bpszm\" (UID: \"54b1b226-0457-4415-85be-c66a6e47c41b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:07.209783 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:07.209753 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/54b1b226-0457-4415-85be-c66a6e47c41b-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-bpszm\" (UID: \"54b1b226-0457-4415-85be-c66a6e47c41b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:07.212154 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:07.212128 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/54b1b226-0457-4415-85be-c66a6e47c41b-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-bpszm\" (UID: \"54b1b226-0457-4415-85be-c66a6e47c41b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:07.286367 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:07.286338 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:07.396501 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:07.396474 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"]
Apr 16 14:55:07.399220 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:07.399188 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54b1b226_0457_4415_85be_c66a6e47c41b.slice/crio-262ed219660f544988826b62b615df5793853ce800a34eb0883b8f6b40742d3a WatchSource:0}: Error finding container 262ed219660f544988826b62b615df5793853ce800a34eb0883b8f6b40742d3a: Status 404 returned error can't find the container with id 262ed219660f544988826b62b615df5793853ce800a34eb0883b8f6b40742d3a
Apr 16 14:55:07.711851 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:07.711763 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm" event={"ID":"54b1b226-0457-4415-85be-c66a6e47c41b","Type":"ContainerStarted","Data":"262ed219660f544988826b62b615df5793853ce800a34eb0883b8f6b40742d3a"}
Apr 16 14:55:08.715243 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:08.715209 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm" event={"ID":"54b1b226-0457-4415-85be-c66a6e47c41b","Type":"ContainerStarted","Data":"0d31f24f2596ac8b1ceb192c5b81b99f6c0cc4fe20d9af6aa38a7029f13b89d2"}
Apr 16 14:55:08.737711 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:08.734172 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm" podStartSLOduration=1.555102878 podStartE2EDuration="2.73414993s" podCreationTimestamp="2026-04-16 14:55:06 +0000 UTC" firstStartedPulling="2026-04-16 14:55:07.401155146 +0000 UTC m=+179.782152738" lastFinishedPulling="2026-04-16 14:55:08.580202188 +0000 UTC m=+180.961199790" observedRunningTime="2026-04-16 14:55:08.733664861 +0000 UTC m=+181.114662488" watchObservedRunningTime="2026-04-16 14:55:08.73414993 +0000 UTC m=+181.115147543"
Apr 16 14:55:09.717670 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:09.717640 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:09.723317 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:09.723287 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-bpszm"
Apr 16 14:55:10.032495 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.032461 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-2jfnl"]
Apr 16 14:55:10.035715 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.035699 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.038722 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.038698 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 14:55:10.039164 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.039145 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 14:55:10.039258 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.039153 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-6fgt6\""
Apr 16 14:55:10.039930 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.039910 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:55:10.047690 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.047670 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-2jfnl"]
Apr 16 14:55:10.131208 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.131183 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzf6\" (UniqueName: \"kubernetes.io/projected/dd8d9682-b82b-4d5a-b119-96819a1ff714-kube-api-access-gfzf6\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.131335 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.131222 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8d9682-b82b-4d5a-b119-96819a1ff714-metrics-client-ca\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.131335 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.131241 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8d9682-b82b-4d5a-b119-96819a1ff714-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.131335 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.131258 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd8d9682-b82b-4d5a-b119-96819a1ff714-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.232001 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.231969 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzf6\" (UniqueName: \"kubernetes.io/projected/dd8d9682-b82b-4d5a-b119-96819a1ff714-kube-api-access-gfzf6\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.232171 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.232019 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8d9682-b82b-4d5a-b119-96819a1ff714-metrics-client-ca\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.232171 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.232045 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8d9682-b82b-4d5a-b119-96819a1ff714-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.232171 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.232071 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd8d9682-b82b-4d5a-b119-96819a1ff714-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.232625 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.232574 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8d9682-b82b-4d5a-b119-96819a1ff714-metrics-client-ca\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.234365 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.234342 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/dd8d9682-b82b-4d5a-b119-96819a1ff714-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.234365 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.234353 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8d9682-b82b-4d5a-b119-96819a1ff714-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.240063 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.240042 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzf6\" (UniqueName: \"kubernetes.io/projected/dd8d9682-b82b-4d5a-b119-96819a1ff714-kube-api-access-gfzf6\") pod \"prometheus-operator-78f957474d-2jfnl\" (UID: \"dd8d9682-b82b-4d5a-b119-96819a1ff714\") " pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.345284 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.345224 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl"
Apr 16 14:55:10.459602 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.459578 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-2jfnl"]
Apr 16 14:55:10.461981 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:10.461952 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8d9682_b82b_4d5a_b119_96819a1ff714.slice/crio-65f50b8d1e32d5a40ccaa50f75e0629af8a82af3e2d4bc85af4da28a689c04cd WatchSource:0}: Error finding container 65f50b8d1e32d5a40ccaa50f75e0629af8a82af3e2d4bc85af4da28a689c04cd: Status 404 returned error can't find the container with id 65f50b8d1e32d5a40ccaa50f75e0629af8a82af3e2d4bc85af4da28a689c04cd
Apr 16 14:55:10.721078 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:10.721001 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl" event={"ID":"dd8d9682-b82b-4d5a-b119-96819a1ff714","Type":"ContainerStarted","Data":"65f50b8d1e32d5a40ccaa50f75e0629af8a82af3e2d4bc85af4da28a689c04cd"}
Apr 16 14:55:11.725375 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:11.725342 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl" event={"ID":"dd8d9682-b82b-4d5a-b119-96819a1ff714","Type":"ContainerStarted","Data":"9a5970ad52399ee5c70bba1ca8351de379b4e063f8253399c4f0454294ba2a52"}
Apr 16 14:55:11.725717 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:11.725388 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl" event={"ID":"dd8d9682-b82b-4d5a-b119-96819a1ff714","Type":"ContainerStarted","Data":"6fed41bc9744556a5d55b6a8c68618cd4a9f21acf8f2a38cebf9259035a755a4"}
Apr 16 14:55:13.385587 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.385540 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-2jfnl" podStartSLOduration=2.284536631 podStartE2EDuration="3.385521836s" podCreationTimestamp="2026-04-16 14:55:10 +0000 UTC" firstStartedPulling="2026-04-16 14:55:10.463877705 +0000 UTC m=+182.844875298" lastFinishedPulling="2026-04-16 14:55:11.564862897 +0000 UTC m=+183.945860503" observedRunningTime="2026-04-16 14:55:11.746696136 +0000 UTC m=+184.127693750" watchObservedRunningTime="2026-04-16 14:55:13.385521836 +0000 UTC m=+185.766519450"
Apr 16 14:55:13.386695 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.386672 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"]
Apr 16 14:55:13.389963 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.389947 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.393864 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.393840 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-wgx5p\""
Apr 16 14:55:13.393864 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.393840 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:55:13.394019 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.393939 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 14:55:13.402309 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.402288 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"]
Apr 16 14:55:13.446871 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.446840 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vf9jk"]
Apr 16 14:55:13.449990 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.449970 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.453785 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.453766 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:55:13.453893 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.453857 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:55:13.453893 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.453861 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mcggm\""
Apr 16 14:55:13.462555 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.462538 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:55:13.559794 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559767 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-tls\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.559905 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559798 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pnn9\" (UniqueName: \"kubernetes.io/projected/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-kube-api-access-9pnn9\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.559905 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559817 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-root\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.559905 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559836 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.560011 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559951 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-wtmp\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.560011 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559970 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3367a751-ed62-4f70-b4a8-77d1c4071c5e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.560011 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.559989 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mfjp\" (UniqueName: \"kubernetes.io/projected/3367a751-ed62-4f70-b4a8-77d1c4071c5e-kube-api-access-4mfjp\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.560011 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.560007 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3367a751-ed62-4f70-b4a8-77d1c4071c5e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.560184 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.560032 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-textfile\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.560184 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.560088 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-accelerators-collector-config\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.560184 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.560129 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-sys\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.560184 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.560178 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-metrics-client-ca\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.560343 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.560207 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3367a751-ed62-4f70-b4a8-77d1c4071c5e-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.661477 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661405 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-textfile\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661477 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661437 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-accelerators-collector-config\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661477 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661460 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-sys\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661490 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-metrics-client-ca\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661540 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-sys\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661589 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3367a751-ed62-4f70-b4a8-77d1c4071c5e-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.661750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661692 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-tls\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661724 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pnn9\" (UniqueName: \"kubernetes.io/projected/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-kube-api-access-9pnn9\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.661750 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661748 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-root\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661774 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661833 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-textfile\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661869 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-wtmp\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661896 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3367a751-ed62-4f70-b4a8-77d1c4071c5e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661910 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-root\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661922 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mfjp\" (UniqueName: \"kubernetes.io/projected/3367a751-ed62-4f70-b4a8-77d1c4071c5e-kube-api-access-4mfjp\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.661944 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3367a751-ed62-4f70-b4a8-77d1c4071c5e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.662040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.662024 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-wtmp\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662367 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.662346 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3367a751-ed62-4f70-b4a8-77d1c4071c5e-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.662721 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.662662 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-metrics-client-ca\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.662843 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.662751 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-accelerators-collector-config\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.664489 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.664467 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-tls\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.664624 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.664516 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3367a751-ed62-4f70-b4a8-77d1c4071c5e-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.664706 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.664695 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.664790 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.664770 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3367a751-ed62-4f70-b4a8-77d1c4071c5e-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"
Apr 16 14:55:13.670341 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.670317 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pnn9\" (UniqueName: \"kubernetes.io/projected/cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46-kube-api-access-9pnn9\") pod \"node-exporter-vf9jk\" (UID: \"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46\") " pod="openshift-monitoring/node-exporter-vf9jk"
Apr 16 14:55:13.670341 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.670325 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mfjp\" (UniqueName: \"kubernetes.io/projected/3367a751-ed62-4f70-b4a8-77d1c4071c5e-kube-api-access-4mfjp\") pod \"openshift-state-metrics-5669946b84-mfp4j\" (UID: \"3367a751-ed62-4f70-b4a8-77d1c4071c5e\") "
pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" Apr 16 14:55:13.699261 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.699232 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" Apr 16 14:55:13.759385 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.759356 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vf9jk" Apr 16 14:55:13.819288 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:13.819235 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j"] Apr 16 14:55:13.822056 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:13.822023 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3367a751_ed62_4f70_b4a8_77d1c4071c5e.slice/crio-697f78edc9c62148fc6f20fd7eb380d11cca8dcb2b1607ad6a66acf2a9935f7a WatchSource:0}: Error finding container 697f78edc9c62148fc6f20fd7eb380d11cca8dcb2b1607ad6a66acf2a9935f7a: Status 404 returned error can't find the container with id 697f78edc9c62148fc6f20fd7eb380d11cca8dcb2b1607ad6a66acf2a9935f7a Apr 16 14:55:14.221751 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:14.221667 2561 scope.go:117] "RemoveContainer" containerID="24c8ba653f740e9cd1f34787b6a62132d21b14bc60da2ce3b13f2e9734468619" Apr 16 14:55:14.221943 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:55:14.221920 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-sx65q_openshift-console-operator(e23b4d8a-18ac-44f6-bf57-10e1ac5b3247)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podUID="e23b4d8a-18ac-44f6-bf57-10e1ac5b3247" Apr 16 14:55:14.735243 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:55:14.735171 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" event={"ID":"3367a751-ed62-4f70-b4a8-77d1c4071c5e","Type":"ContainerStarted","Data":"1324986f44bf809388b5e9ac795b064137c6593b20f538c4d2f0a2d75533ab9c"} Apr 16 14:55:14.735243 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:14.735215 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" event={"ID":"3367a751-ed62-4f70-b4a8-77d1c4071c5e","Type":"ContainerStarted","Data":"598fa4f1fb489c1fa5c71dfad5a61c23f5e84301ab6375b50c5b9e439593c435"} Apr 16 14:55:14.735243 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:14.735230 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" event={"ID":"3367a751-ed62-4f70-b4a8-77d1c4071c5e","Type":"ContainerStarted","Data":"697f78edc9c62148fc6f20fd7eb380d11cca8dcb2b1607ad6a66acf2a9935f7a"} Apr 16 14:55:14.736991 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:14.736962 2561 generic.go:358] "Generic (PLEG): container finished" podID="cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46" containerID="0818bfd13d257f1372f68c12f1b927047ef197208fdfd641cbc9461139b342e9" exitCode=0 Apr 16 14:55:14.737100 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:14.737013 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vf9jk" event={"ID":"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46","Type":"ContainerDied","Data":"0818bfd13d257f1372f68c12f1b927047ef197208fdfd641cbc9461139b342e9"} Apr 16 14:55:14.737100 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:14.737041 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vf9jk" event={"ID":"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46","Type":"ContainerStarted","Data":"955d783d4ea274be18f08b22651376558dc3c371d2c9b8752e9baa995f455caa"} Apr 16 14:55:15.741496 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:15.741459 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vf9jk" event={"ID":"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46","Type":"ContainerStarted","Data":"4a732e211afecb3c28516d63733fc72291bd963118810db9d5577ae9613072a6"} Apr 16 14:55:15.741496 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:15.741501 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vf9jk" event={"ID":"cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46","Type":"ContainerStarted","Data":"76940b5cbe5d5a7038335a766813e4d2f0c21fef51212658fd59712c2c653e5b"} Apr 16 14:55:15.743280 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:15.743256 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" event={"ID":"3367a751-ed62-4f70-b4a8-77d1c4071c5e","Type":"ContainerStarted","Data":"faebd88329dedcc831a8195b83d76ad425decd2ed6753110ef42306ae9dbee00"} Apr 16 14:55:15.759550 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:15.759512 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vf9jk" podStartSLOduration=2.052385813 podStartE2EDuration="2.759501892s" podCreationTimestamp="2026-04-16 14:55:13 +0000 UTC" firstStartedPulling="2026-04-16 14:55:13.773396187 +0000 UTC m=+186.154393779" lastFinishedPulling="2026-04-16 14:55:14.480512251 +0000 UTC m=+186.861509858" observedRunningTime="2026-04-16 14:55:15.758390274 +0000 UTC m=+188.139387889" watchObservedRunningTime="2026-04-16 14:55:15.759501892 +0000 UTC m=+188.140499506" Apr 16 14:55:15.773621 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:15.773571 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-mfp4j" podStartSLOduration=1.790475678 podStartE2EDuration="2.773562094s" podCreationTimestamp="2026-04-16 14:55:13 +0000 UTC" 
firstStartedPulling="2026-04-16 14:55:13.942991533 +0000 UTC m=+186.323989129" lastFinishedPulling="2026-04-16 14:55:14.92607794 +0000 UTC m=+187.307075545" observedRunningTime="2026-04-16 14:55:15.773044225 +0000 UTC m=+188.154041852" watchObservedRunningTime="2026-04-16 14:55:15.773562094 +0000 UTC m=+188.154559707" Apr 16 14:55:17.716324 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.716285 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56f49c9dcd-c869v"] Apr 16 14:55:17.719797 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.719775 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.724300 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.724275 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 14:55:17.724300 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.724288 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3n4im9bi22kvd\"" Apr 16 14:55:17.724300 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.724282 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 14:55:17.724508 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.724339 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 14:55:17.724508 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.724291 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zj7hj\"" Apr 16 14:55:17.724658 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.724633 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:55:17.730586 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.730565 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56f49c9dcd-c869v"] Apr 16 14:55:17.796586 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796559 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3debccbd-ccd0-4888-be40-8734a1730d29-audit-log\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.796734 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796594 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-secret-metrics-server-client-certs\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.796734 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796720 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-secret-metrics-server-tls\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.796846 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796800 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3debccbd-ccd0-4888-be40-8734a1730d29-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.796846 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796831 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3debccbd-ccd0-4888-be40-8734a1730d29-metrics-server-audit-profiles\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.796948 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796890 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-client-ca-bundle\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.797001 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.796962 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbphp\" (UniqueName: \"kubernetes.io/projected/3debccbd-ccd0-4888-be40-8734a1730d29-kube-api-access-vbphp\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898141 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.898103 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-client-ca-bundle\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898280 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:55:17.898158 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbphp\" (UniqueName: \"kubernetes.io/projected/3debccbd-ccd0-4888-be40-8734a1730d29-kube-api-access-vbphp\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898280 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.898183 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3debccbd-ccd0-4888-be40-8734a1730d29-audit-log\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898280 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.898203 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-secret-metrics-server-client-certs\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898280 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.898247 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-secret-metrics-server-tls\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898434 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.898301 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3debccbd-ccd0-4888-be40-8734a1730d29-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.898434 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.898318 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3debccbd-ccd0-4888-be40-8734a1730d29-metrics-server-audit-profiles\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.899396 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.899372 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3debccbd-ccd0-4888-be40-8734a1730d29-audit-log\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.900136 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.900112 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3debccbd-ccd0-4888-be40-8734a1730d29-metrics-server-audit-profiles\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.900663 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.900637 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3debccbd-ccd0-4888-be40-8734a1730d29-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " 
pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.900839 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.900817 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-secret-metrics-server-client-certs\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.900898 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.900826 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-secret-metrics-server-tls\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.900898 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.900879 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3debccbd-ccd0-4888-be40-8734a1730d29-client-ca-bundle\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:17.909922 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:17.909901 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbphp\" (UniqueName: \"kubernetes.io/projected/3debccbd-ccd0-4888-be40-8734a1730d29-kube-api-access-vbphp\") pod \"metrics-server-56f49c9dcd-c869v\" (UID: \"3debccbd-ccd0-4888-be40-8734a1730d29\") " pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:18.029415 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:18.029389 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" Apr 16 14:55:18.156654 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:18.156629 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56f49c9dcd-c869v"] Apr 16 14:55:18.159077 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:18.159050 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3debccbd_ccd0_4888_be40_8734a1730d29.slice/crio-7f701c306f973ade720af4510e6c9b6daa76b5fc7b69252b40cb6d6f6c3a7cea WatchSource:0}: Error finding container 7f701c306f973ade720af4510e6c9b6daa76b5fc7b69252b40cb6d6f6c3a7cea: Status 404 returned error can't find the container with id 7f701c306f973ade720af4510e6c9b6daa76b5fc7b69252b40cb6d6f6c3a7cea Apr 16 14:55:18.752766 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:18.752725 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" event={"ID":"3debccbd-ccd0-4888-be40-8734a1730d29","Type":"ContainerStarted","Data":"7f701c306f973ade720af4510e6c9b6daa76b5fc7b69252b40cb6d6f6c3a7cea"} Apr 16 14:55:19.756998 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:19.756918 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" event={"ID":"3debccbd-ccd0-4888-be40-8734a1730d29","Type":"ContainerStarted","Data":"98fa0bf7c55bc8f04fcc7cafeba3a9df3b664d5cae1fc542e0c2722323c47431"} Apr 16 14:55:19.775879 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:19.775831 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v" podStartSLOduration=1.480960711 podStartE2EDuration="2.775816898s" podCreationTimestamp="2026-04-16 14:55:17 +0000 UTC" firstStartedPulling="2026-04-16 14:55:18.160979348 +0000 UTC m=+190.541976940" lastFinishedPulling="2026-04-16 14:55:19.45583552 +0000 
UTC m=+191.836833127" observedRunningTime="2026-04-16 14:55:19.775089216 +0000 UTC m=+192.156086830" watchObservedRunningTime="2026-04-16 14:55:19.775816898 +0000 UTC m=+192.156814513" Apr 16 14:55:25.265396 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.265363 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68684b6fc-df98m"] Apr 16 14:55:25.265846 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:55:25.265525 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-68684b6fc-df98m" podUID="d105a85e-d684-4862-877b-ab11c5b1ca26" Apr 16 14:55:25.771627 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.771583 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:55:25.775773 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.775748 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68684b6fc-df98m" Apr 16 14:55:25.864351 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864325 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-installation-pull-secrets\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864455 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864372 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-kube-api-access-6sghm\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864455 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864395 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-bound-sa-token\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864455 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864424 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-trusted-ca\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864455 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864445 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-image-registry-private-configuration\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: 
\"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864659 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864585 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d105a85e-d684-4862-877b-ab11c5b1ca26-ca-trust-extracted\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864721 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864703 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-certificates\") pod \"d105a85e-d684-4862-877b-ab11c5b1ca26\" (UID: \"d105a85e-d684-4862-877b-ab11c5b1ca26\") " Apr 16 14:55:25.864901 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864872 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d105a85e-d684-4862-877b-ab11c5b1ca26-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:25.865003 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.864898 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:25.865047 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.865003 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-trusted-ca\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:55:25.865047 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.865023 2561 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d105a85e-d684-4862-877b-ab11c5b1ca26-ca-trust-extracted\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:55:25.865171 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.865144 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:55:25.866751 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.866729 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-kube-api-access-6sghm" (OuterVolumeSpecName: "kube-api-access-6sghm") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "kube-api-access-6sghm". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:55:25.866833 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.866756 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:25.867056 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.867034 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:55:25.867128 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.867061 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d105a85e-d684-4862-877b-ab11c5b1ca26" (UID: "d105a85e-d684-4862-877b-ab11c5b1ca26"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:25.965565 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.965539 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-certificates\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\""
Apr 16 14:55:25.965565 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.965562 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-installation-pull-secrets\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\""
Apr 16 14:55:25.965725 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.965572 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6sghm\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-kube-api-access-6sghm\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\""
Apr 16 14:55:25.965725 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.965582 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-bound-sa-token\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\""
Apr 16 14:55:25.965725 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:25.965591 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d105a85e-d684-4862-877b-ab11c5b1ca26-image-registry-private-configuration\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\""
Apr 16 14:55:26.774404 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:26.774367 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68684b6fc-df98m"
Apr 16 14:55:26.814792 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:26.814765 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68684b6fc-df98m"]
Apr 16 14:55:26.819049 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:26.819023 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68684b6fc-df98m"]
Apr 16 14:55:26.873309 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:26.873282 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d105a85e-d684-4862-877b-ab11c5b1ca26-registry-tls\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\""
Apr 16 14:55:27.221472 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.221387 2561 scope.go:117] "RemoveContainer" containerID="24c8ba653f740e9cd1f34787b6a62132d21b14bc60da2ce3b13f2e9734468619"
Apr 16 14:55:27.779085 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.779058 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log"
Apr 16 14:55:27.779538 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.779122 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" event={"ID":"e23b4d8a-18ac-44f6-bf57-10e1ac5b3247","Type":"ContainerStarted","Data":"c389d1ca2f77f63ba03ba4928c516d8365ce088a2a1f6faf8bcd943ac310be6d"}
Apr 16 14:55:27.779538 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.779386 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:55:27.785189 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.785169 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q"
Apr 16 14:55:27.806558 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.806512 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-sx65q" podStartSLOduration=53.847864008 podStartE2EDuration="55.806499074s" podCreationTimestamp="2026-04-16 14:54:32 +0000 UTC" firstStartedPulling="2026-04-16 14:54:33.156778506 +0000 UTC m=+145.537776098" lastFinishedPulling="2026-04-16 14:54:35.115413569 +0000 UTC m=+147.496411164" observedRunningTime="2026-04-16 14:55:27.805147246 +0000 UTC m=+200.186144860" watchObservedRunningTime="2026-04-16 14:55:27.806499074 +0000 UTC m=+200.187496687"
Apr 16 14:55:27.833487 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.833458 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-bmqpp"]
Apr 16 14:55:27.836530 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.836507 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:27.840866 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.840844 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 14:55:27.840977 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.840938 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-27jns\""
Apr 16 14:55:27.842075 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.842056 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 14:55:27.848174 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.848152 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-bmqpp"]
Apr 16 14:55:27.881667 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.881642 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhl8x\" (UniqueName: \"kubernetes.io/projected/67dfdfc1-e78f-4a26-ac78-22a18df6d6a7-kube-api-access-rhl8x\") pod \"downloads-586b57c7b4-bmqpp\" (UID: \"67dfdfc1-e78f-4a26-ac78-22a18df6d6a7\") " pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:27.982881 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.982854 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhl8x\" (UniqueName: \"kubernetes.io/projected/67dfdfc1-e78f-4a26-ac78-22a18df6d6a7-kube-api-access-rhl8x\") pod \"downloads-586b57c7b4-bmqpp\" (UID: \"67dfdfc1-e78f-4a26-ac78-22a18df6d6a7\") " pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:27.996306 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:27.996285 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhl8x\" (UniqueName: \"kubernetes.io/projected/67dfdfc1-e78f-4a26-ac78-22a18df6d6a7-kube-api-access-rhl8x\") pod \"downloads-586b57c7b4-bmqpp\" (UID: \"67dfdfc1-e78f-4a26-ac78-22a18df6d6a7\") " pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:28.145800 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:28.145731 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:28.231807 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:28.231779 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d105a85e-d684-4862-877b-ab11c5b1ca26" path="/var/lib/kubelet/pods/d105a85e-d684-4862-877b-ab11c5b1ca26/volumes"
Apr 16 14:55:28.262601 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:28.262417 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-bmqpp"]
Apr 16 14:55:28.265012 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:28.264986 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67dfdfc1_e78f_4a26_ac78_22a18df6d6a7.slice/crio-b00a53a41c9588e559253b999d857a3e2051bfa121206379d5f3715f9f671d9d WatchSource:0}: Error finding container b00a53a41c9588e559253b999d857a3e2051bfa121206379d5f3715f9f671d9d: Status 404 returned error can't find the container with id b00a53a41c9588e559253b999d857a3e2051bfa121206379d5f3715f9f671d9d
Apr 16 14:55:28.782859 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:28.782804 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-bmqpp" event={"ID":"67dfdfc1-e78f-4a26-ac78-22a18df6d6a7","Type":"ContainerStarted","Data":"b00a53a41c9588e559253b999d857a3e2051bfa121206379d5f3715f9f671d9d"}
Apr 16 14:55:36.752452 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.751177 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d67657d7b-tsgb6"]
Apr 16 14:55:36.753951 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.753929 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.758244 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.758066 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:55:36.758244 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.758093 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:55:36.758244 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.758066 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:55:36.758244 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.758108 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:55:36.758244 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.758122 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:55:36.758244 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.758095 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ctms5\""
Apr 16 14:55:36.767292 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.767271 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d67657d7b-tsgb6"]
Apr 16 14:55:36.872464 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.872430 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-oauth-serving-cert\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.872674 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.872499 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-oauth-config\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.872674 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.872529 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-kube-api-access-fngn5\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.872674 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.872627 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-service-ca\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.872674 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.872671 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-config\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.872874 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.872723 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-serving-cert\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.973697 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.973659 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-serving-cert\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.973890 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.973794 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-oauth-serving-cert\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.973890 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.973830 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-oauth-config\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.973890 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.973847 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-kube-api-access-fngn5\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.973890 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.973865 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-service-ca\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.973890 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.973891 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-config\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.974591 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.974560 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-oauth-serving-cert\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.975205 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.975179 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-config\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.975350 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.975235 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-service-ca\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.976490 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.976470 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-serving-cert\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.976622 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.976545 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-oauth-config\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:36.988203 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:36.988181 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-kube-api-access-fngn5\") pod \"console-6d67657d7b-tsgb6\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:37.063241 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:37.063212 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:38.029859 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:38.029824 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v"
Apr 16 14:55:38.030291 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:38.029886 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v"
Apr 16 14:55:43.590944 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:43.590920 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d67657d7b-tsgb6"]
Apr 16 14:55:43.597673 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:43.597638 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9a172d_ab11_45d0_8094_dec3ae05cb5e.slice/crio-fac69d5591492dbc5bceb3d8cbf80eeee23f6faf5eec44e4391658e42a594d68 WatchSource:0}: Error finding container fac69d5591492dbc5bceb3d8cbf80eeee23f6faf5eec44e4391658e42a594d68: Status 404 returned error can't find the container with id fac69d5591492dbc5bceb3d8cbf80eeee23f6faf5eec44e4391658e42a594d68
Apr 16 14:55:43.828811 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:43.828765 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-bmqpp" event={"ID":"67dfdfc1-e78f-4a26-ac78-22a18df6d6a7","Type":"ContainerStarted","Data":"82e2acd5ef5c3df0e1b182e2511358615f62d573659038a63ea07d722b5aae50"}
Apr 16 14:55:43.828999 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:43.828969 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:43.830079 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:43.830053 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d67657d7b-tsgb6" event={"ID":"5b9a172d-ab11-45d0-8094-dec3ae05cb5e","Type":"ContainerStarted","Data":"fac69d5591492dbc5bceb3d8cbf80eeee23f6faf5eec44e4391658e42a594d68"}
Apr 16 14:55:43.845837 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:43.845785 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-bmqpp" podStartSLOduration=1.5465493559999999 podStartE2EDuration="16.845771636s" podCreationTimestamp="2026-04-16 14:55:27 +0000 UTC" firstStartedPulling="2026-04-16 14:55:28.267092787 +0000 UTC m=+200.648090382" lastFinishedPulling="2026-04-16 14:55:43.566315053 +0000 UTC m=+215.947312662" observedRunningTime="2026-04-16 14:55:43.844272042 +0000 UTC m=+216.225269658" watchObservedRunningTime="2026-04-16 14:55:43.845771636 +0000 UTC m=+216.226769251"
Apr 16 14:55:43.845961 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:43.845891 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-bmqpp"
Apr 16 14:55:45.934285 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:45.934245 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79bbb545ff-stc72"]
Apr 16 14:55:45.962798 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:45.962767 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79bbb545ff-stc72"]
Apr 16 14:55:45.962965 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:45.962905 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:45.972596 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:45.972256 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:55:46.063089 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.062964 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gc7\" (UniqueName: \"kubernetes.io/projected/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-kube-api-access-52gc7\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.063089 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.063036 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-trusted-ca-bundle\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.063468 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.063178 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-serving-cert\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.063468 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.063218 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-oauth-serving-cert\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.063468 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.063249 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-service-ca\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.063468 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.063376 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-oauth-config\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.063468 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.063473 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-config\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.164683 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164642 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52gc7\" (UniqueName: \"kubernetes.io/projected/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-kube-api-access-52gc7\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.164860 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164722 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-trusted-ca-bundle\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.164860 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164776 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-serving-cert\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.164860 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164801 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-oauth-serving-cert\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.164860 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164820 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-service-ca\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.165076 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164872 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-oauth-config\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.165076 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.164936 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-config\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.165759 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.165673 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-config\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.165759 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.165708 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-service-ca\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.166273 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.166220 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-oauth-serving-cert\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.166448 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.166423 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-trusted-ca-bundle\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.168521 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.168497 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-oauth-config\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.168954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.168912 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-serving-cert\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.176514 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.176472 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gc7\" (UniqueName: \"kubernetes.io/projected/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-kube-api-access-52gc7\") pod \"console-79bbb545ff-stc72\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.276656 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.276571 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:46.671214 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.671096 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79bbb545ff-stc72"]
Apr 16 14:55:46.831478 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:55:46.831389 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c22f2f5_f7d8_4fb3_871e_ed91eb51d7ef.slice/crio-76dc3e7d0f3d4c1baa8c4f9c8325f420b278aeba69e25d970c1a4706c7997614 WatchSource:0}: Error finding container 76dc3e7d0f3d4c1baa8c4f9c8325f420b278aeba69e25d970c1a4706c7997614: Status 404 returned error can't find the container with id 76dc3e7d0f3d4c1baa8c4f9c8325f420b278aeba69e25d970c1a4706c7997614
Apr 16 14:55:46.840938 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:46.840904 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bbb545ff-stc72" event={"ID":"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef","Type":"ContainerStarted","Data":"76dc3e7d0f3d4c1baa8c4f9c8325f420b278aeba69e25d970c1a4706c7997614"}
Apr 16 14:55:47.845915 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:47.845874 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bbb545ff-stc72" event={"ID":"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef","Type":"ContainerStarted","Data":"fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613"}
Apr 16 14:55:47.847459 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:47.847432 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d67657d7b-tsgb6" event={"ID":"5b9a172d-ab11-45d0-8094-dec3ae05cb5e","Type":"ContainerStarted","Data":"591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5"}
Apr 16 14:55:47.865799 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:47.865745 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79bbb545ff-stc72" podStartSLOduration=2.8657319709999998 podStartE2EDuration="2.865731971s" podCreationTimestamp="2026-04-16 14:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:47.864291487 +0000 UTC m=+220.245289115" watchObservedRunningTime="2026-04-16 14:55:47.865731971 +0000 UTC m=+220.246729584"
Apr 16 14:55:47.890170 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:47.890126 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d67657d7b-tsgb6" podStartSLOduration=8.625110858 podStartE2EDuration="11.890112567s" podCreationTimestamp="2026-04-16 14:55:36 +0000 UTC" firstStartedPulling="2026-04-16 14:55:43.599637017 +0000 UTC m=+215.980634623" lastFinishedPulling="2026-04-16 14:55:46.864638732 +0000 UTC m=+219.245636332" observedRunningTime="2026-04-16 14:55:47.889129333 +0000 UTC m=+220.270126946" watchObservedRunningTime="2026-04-16 14:55:47.890112567 +0000 UTC m=+220.271110183"
Apr 16 14:55:50.858593 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:50.858556 2561 generic.go:358] "Generic (PLEG): container finished" podID="267060b3-88e5-4515-b499-8a01192a414b" containerID="efd5f26a39c7b037483e2245137b62e725ebbc03cd311428ba457dcb79554f9e" exitCode=0
Apr 16 14:55:50.859028 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:50.858603 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" event={"ID":"267060b3-88e5-4515-b499-8a01192a414b","Type":"ContainerDied","Data":"efd5f26a39c7b037483e2245137b62e725ebbc03cd311428ba457dcb79554f9e"}
Apr 16 14:55:50.859028 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:50.858928 2561 scope.go:117] "RemoveContainer" containerID="efd5f26a39c7b037483e2245137b62e725ebbc03cd311428ba457dcb79554f9e"
Apr 16 14:55:51.864198 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:51.864160 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-2lqmg" event={"ID":"267060b3-88e5-4515-b499-8a01192a414b","Type":"ContainerStarted","Data":"004f5021b849b9b780d77d58dd41eae10c9aa876b5c526ac0f9efb87c6a66493"}
Apr 16 14:55:56.277487 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:56.277453 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:56.277958 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:56.277503 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:56.282806 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:56.282782 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:56.882738 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:56.882713 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79bbb545ff-stc72"
Apr 16 14:55:56.934842 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:56.934810 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d67657d7b-tsgb6"]
Apr 16 14:55:57.063512 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:57.063484 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d67657d7b-tsgb6"
Apr 16 14:55:58.035202 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:58.035174 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v"
Apr 16 14:55:58.038829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:55:58.038811 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56f49c9dcd-c869v"
Apr 16 14:56:12.931015 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:12.930983 2561 generic.go:358] "Generic (PLEG): container finished" podID="46f7af62-73ce-4f89-a210-d2280368ebfc" containerID="5a260ac684a3d0715d62b739a8451486e4cf77058963d4c67334b2198da70174" exitCode=0
Apr 16 14:56:12.931437 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:12.931026 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n" event={"ID":"46f7af62-73ce-4f89-a210-d2280368ebfc","Type":"ContainerDied","Data":"5a260ac684a3d0715d62b739a8451486e4cf77058963d4c67334b2198da70174"}
Apr 16 14:56:12.931437 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:12.931313 2561 scope.go:117] "RemoveContainer" containerID="5a260ac684a3d0715d62b739a8451486e4cf77058963d4c67334b2198da70174"
Apr 16 14:56:13.941235 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:13.941201 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth4n" event={"ID":"46f7af62-73ce-4f89-a210-d2280368ebfc","Type":"ContainerStarted","Data":"fb4ab47732528f01c5ddcf16f1827af59d07ae468c2a126ed6be355b25f46c2b"}
Apr 16 14:56:19.939004 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:19.938960 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") " pod="openshift-multus/network-metrics-daemon-bppkn"
Apr 16 14:56:19.941190 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:19.941167 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed427102-c549-468d-8146-32fba6da0a45-metrics-certs\") pod \"network-metrics-daemon-bppkn\" (UID: \"ed427102-c549-468d-8146-32fba6da0a45\") "
pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:56:20.025097 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:20.025068 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hst6r\"" Apr 16 14:56:20.033237 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:20.033219 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bppkn" Apr 16 14:56:20.150043 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:20.150021 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bppkn"] Apr 16 14:56:20.152547 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:56:20.152520 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded427102_c549_468d_8146_32fba6da0a45.slice/crio-2eb7acb2d12789d23ad6a713f44c9669a9766897ac73d26ce5b4447440b82cb8 WatchSource:0}: Error finding container 2eb7acb2d12789d23ad6a713f44c9669a9766897ac73d26ce5b4447440b82cb8: Status 404 returned error can't find the container with id 2eb7acb2d12789d23ad6a713f44c9669a9766897ac73d26ce5b4447440b82cb8 Apr 16 14:56:20.961771 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:20.961735 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bppkn" event={"ID":"ed427102-c549-468d-8146-32fba6da0a45","Type":"ContainerStarted","Data":"2eb7acb2d12789d23ad6a713f44c9669a9766897ac73d26ce5b4447440b82cb8"} Apr 16 14:56:21.954185 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:21.954145 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d67657d7b-tsgb6" podUID="5b9a172d-ab11-45d0-8094-dec3ae05cb5e" containerName="console" containerID="cri-o://591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5" gracePeriod=15 Apr 16 14:56:21.966252 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:56:21.966215 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bppkn" event={"ID":"ed427102-c549-468d-8146-32fba6da0a45","Type":"ContainerStarted","Data":"29b5caeff454454423eb2cbb93d982a6f7afd9e8ffd1e11915198ed3fedbdec2"} Apr 16 14:56:21.966535 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:21.966261 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bppkn" event={"ID":"ed427102-c549-468d-8146-32fba6da0a45","Type":"ContainerStarted","Data":"9cc8cfb9407e109f1293ebbe384ce34f71bf13cee8c5249780cdfebcfe7b2a9c"} Apr 16 14:56:21.981834 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:21.981796 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bppkn" podStartSLOduration=252.591119277 podStartE2EDuration="4m13.981782608s" podCreationTimestamp="2026-04-16 14:52:08 +0000 UTC" firstStartedPulling="2026-04-16 14:56:20.154716621 +0000 UTC m=+252.535714213" lastFinishedPulling="2026-04-16 14:56:21.545379952 +0000 UTC m=+253.926377544" observedRunningTime="2026-04-16 14:56:21.980791362 +0000 UTC m=+254.361788979" watchObservedRunningTime="2026-04-16 14:56:21.981782608 +0000 UTC m=+254.362780222" Apr 16 14:56:22.215442 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.215421 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d67657d7b-tsgb6_5b9a172d-ab11-45d0-8094-dec3ae05cb5e/console/0.log" Apr 16 14:56:22.215545 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.215485 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d67657d7b-tsgb6" Apr 16 14:56:22.258484 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258461 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-kube-api-access-fngn5\") pod \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " Apr 16 14:56:22.258601 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258494 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-oauth-config\") pod \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " Apr 16 14:56:22.258601 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258520 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-oauth-serving-cert\") pod \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " Apr 16 14:56:22.258601 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258594 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-serving-cert\") pod \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " Apr 16 14:56:22.258727 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258649 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-config\") pod \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " Apr 16 14:56:22.258727 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258683 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-service-ca\") pod \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\" (UID: \"5b9a172d-ab11-45d0-8094-dec3ae05cb5e\") " Apr 16 14:56:22.259012 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.258984 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5b9a172d-ab11-45d0-8094-dec3ae05cb5e" (UID: "5b9a172d-ab11-45d0-8094-dec3ae05cb5e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:22.259116 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.259024 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-config" (OuterVolumeSpecName: "console-config") pod "5b9a172d-ab11-45d0-8094-dec3ae05cb5e" (UID: "5b9a172d-ab11-45d0-8094-dec3ae05cb5e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:22.259116 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.259086 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-service-ca" (OuterVolumeSpecName: "service-ca") pod "5b9a172d-ab11-45d0-8094-dec3ae05cb5e" (UID: "5b9a172d-ab11-45d0-8094-dec3ae05cb5e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:56:22.260775 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.260747 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5b9a172d-ab11-45d0-8094-dec3ae05cb5e" (UID: "5b9a172d-ab11-45d0-8094-dec3ae05cb5e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:22.260775 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.260756 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-kube-api-access-fngn5" (OuterVolumeSpecName: "kube-api-access-fngn5") pod "5b9a172d-ab11-45d0-8094-dec3ae05cb5e" (UID: "5b9a172d-ab11-45d0-8094-dec3ae05cb5e"). InnerVolumeSpecName "kube-api-access-fngn5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:56:22.260924 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.260859 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5b9a172d-ab11-45d0-8094-dec3ae05cb5e" (UID: "5b9a172d-ab11-45d0-8094-dec3ae05cb5e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:56:22.359829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.359769 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.359829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.359792 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-service-ca\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.359829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.359802 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fngn5\" (UniqueName: \"kubernetes.io/projected/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-kube-api-access-fngn5\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.359829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.359812 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-oauth-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.359829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.359822 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-oauth-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.359829 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.359832 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b9a172d-ab11-45d0-8094-dec3ae05cb5e-console-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:56:22.969916 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:56:22.969892 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d67657d7b-tsgb6_5b9a172d-ab11-45d0-8094-dec3ae05cb5e/console/0.log" Apr 16 14:56:22.970386 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.969929 2561 generic.go:358] "Generic (PLEG): container finished" podID="5b9a172d-ab11-45d0-8094-dec3ae05cb5e" containerID="591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5" exitCode=2 Apr 16 14:56:22.970386 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.969961 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d67657d7b-tsgb6" event={"ID":"5b9a172d-ab11-45d0-8094-dec3ae05cb5e","Type":"ContainerDied","Data":"591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5"} Apr 16 14:56:22.970386 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.969987 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d67657d7b-tsgb6" Apr 16 14:56:22.970386 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.970016 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d67657d7b-tsgb6" event={"ID":"5b9a172d-ab11-45d0-8094-dec3ae05cb5e","Type":"ContainerDied","Data":"fac69d5591492dbc5bceb3d8cbf80eeee23f6faf5eec44e4391658e42a594d68"} Apr 16 14:56:22.970386 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.970042 2561 scope.go:117] "RemoveContainer" containerID="591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5" Apr 16 14:56:22.983165 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.983145 2561 scope.go:117] "RemoveContainer" containerID="591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5" Apr 16 14:56:22.983423 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:56:22.983404 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5\": container with ID starting with 591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5 not found: ID does not exist" containerID="591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5" Apr 16 14:56:22.983473 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.983431 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5"} err="failed to get container status \"591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5\": rpc error: code = NotFound desc = could not find container \"591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5\": container with ID starting with 591c90128372a9fb82f52445a16d560b0d5d807f991e1a4485cef23d8a89c3f5 not found: ID does not exist" Apr 16 14:56:22.996387 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:22.996363 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d67657d7b-tsgb6"] Apr 16 14:56:23.000257 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:23.000239 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d67657d7b-tsgb6"] Apr 16 14:56:24.225147 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:24.225115 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b9a172d-ab11-45d0-8094-dec3ae05cb5e" path="/var/lib/kubelet/pods/5b9a172d-ab11-45d0-8094-dec3ae05cb5e/volumes" Apr 16 14:56:40.893917 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:40.893882 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b65f5bf46-lqnr9"] Apr 16 14:56:40.894688 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:40.894571 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b9a172d-ab11-45d0-8094-dec3ae05cb5e" containerName="console" Apr 16 14:56:40.894688 ip-10-0-142-86 kubenswrapper[2561]: 
I0416 14:56:40.894596 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b9a172d-ab11-45d0-8094-dec3ae05cb5e" containerName="console" Apr 16 14:56:40.894833 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:40.894707 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b9a172d-ab11-45d0-8094-dec3ae05cb5e" containerName="console" Apr 16 14:56:40.898156 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:40.898131 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:40.909802 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:40.909772 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b65f5bf46-lqnr9"] Apr 16 14:56:41.007837 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.007805 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-serving-cert\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.007987 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.007846 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvvq\" (UniqueName: \"kubernetes.io/projected/15ffef42-5fa7-4d64-9d35-899db17f811e-kube-api-access-zmvvq\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.007987 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.007923 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-oauth-config\") pod \"console-7b65f5bf46-lqnr9\" (UID: 
\"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.007987 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.007963 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-console-config\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.007987 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.007985 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-service-ca\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.008128 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.008000 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-trusted-ca-bundle\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.008128 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.008061 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-oauth-serving-cert\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.108861 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.108833 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-oauth-serving-cert\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109001 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.108929 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-serving-cert\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109001 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.108962 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvvq\" (UniqueName: \"kubernetes.io/projected/15ffef42-5fa7-4d64-9d35-899db17f811e-kube-api-access-zmvvq\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109106 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109089 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-oauth-config\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109152 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109138 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-console-config\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109199 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109172 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-service-ca\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109345 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-trusted-ca-bundle\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109436 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109416 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-oauth-serving-cert\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109779 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109757 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-console-config\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.109868 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.109793 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-service-ca\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.110068 ip-10-0-142-86 kubenswrapper[2561]: I0416 
14:56:41.110051 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-trusted-ca-bundle\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.111382 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.111353 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-serving-cert\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.111470 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.111391 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-oauth-config\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.117944 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.117923 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvvq\" (UniqueName: \"kubernetes.io/projected/15ffef42-5fa7-4d64-9d35-899db17f811e-kube-api-access-zmvvq\") pod \"console-7b65f5bf46-lqnr9\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.210833 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.210777 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:41.332922 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:41.332897 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b65f5bf46-lqnr9"] Apr 16 14:56:41.335293 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:56:41.335258 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ffef42_5fa7_4d64_9d35_899db17f811e.slice/crio-d4ff8caf0bada02146b446dbf69ae415a26c9f10e5706ba59f5a2f565e3441bf WatchSource:0}: Error finding container d4ff8caf0bada02146b446dbf69ae415a26c9f10e5706ba59f5a2f565e3441bf: Status 404 returned error can't find the container with id d4ff8caf0bada02146b446dbf69ae415a26c9f10e5706ba59f5a2f565e3441bf Apr 16 14:56:42.029961 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:42.029927 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b65f5bf46-lqnr9" event={"ID":"15ffef42-5fa7-4d64-9d35-899db17f811e","Type":"ContainerStarted","Data":"2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728"} Apr 16 14:56:42.029961 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:42.029963 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b65f5bf46-lqnr9" event={"ID":"15ffef42-5fa7-4d64-9d35-899db17f811e","Type":"ContainerStarted","Data":"d4ff8caf0bada02146b446dbf69ae415a26c9f10e5706ba59f5a2f565e3441bf"} Apr 16 14:56:42.045895 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:42.045847 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b65f5bf46-lqnr9" podStartSLOduration=2.045834811 podStartE2EDuration="2.045834811s" podCreationTimestamp="2026-04-16 14:56:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:56:42.044678113 +0000 UTC m=+274.425675719" 
watchObservedRunningTime="2026-04-16 14:56:42.045834811 +0000 UTC m=+274.426832426" Apr 16 14:56:46.639511 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:56:46.639471 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vlvr9" podUID="d01b24a9-f9f3-4d8c-830c-38ff2cc50292" Apr 16 14:56:46.639511 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:56:46.639471 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5lzf7" podUID="18926de0-0561-424c-845b-6ea1059c821a" Apr 16 14:56:47.044912 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:47.044888 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:56:47.045062 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:47.044894 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5lzf7" Apr 16 14:56:50.586733 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.586699 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:56:50.587095 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.586745 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:56:50.589008 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.588980 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/18926de0-0561-424c-845b-6ea1059c821a-metrics-tls\") pod \"dns-default-5lzf7\" (UID: \"18926de0-0561-424c-845b-6ea1059c821a\") " pod="openshift-dns/dns-default-5lzf7" Apr 16 14:56:50.589138 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.589112 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d01b24a9-f9f3-4d8c-830c-38ff2cc50292-cert\") pod \"ingress-canary-vlvr9\" (UID: \"d01b24a9-f9f3-4d8c-830c-38ff2cc50292\") " pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:56:50.648004 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.647982 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gcjjc\"" Apr 16 14:56:50.648846 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.648830 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-njff8\"" Apr 16 
14:56:50.656664 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.656650 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5lzf7" Apr 16 14:56:50.656818 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.656800 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vlvr9" Apr 16 14:56:50.793805 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.793780 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5lzf7"] Apr 16 14:56:50.796953 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:56:50.796926 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18926de0_0561_424c_845b_6ea1059c821a.slice/crio-a7f28d82db9adf14d8a74316b9df93b33447520b7d05ce0a51ed8ad8214ec034 WatchSource:0}: Error finding container a7f28d82db9adf14d8a74316b9df93b33447520b7d05ce0a51ed8ad8214ec034: Status 404 returned error can't find the container with id a7f28d82db9adf14d8a74316b9df93b33447520b7d05ce0a51ed8ad8214ec034 Apr 16 14:56:50.819874 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:50.819776 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vlvr9"] Apr 16 14:56:50.822215 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:56:50.822190 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd01b24a9_f9f3_4d8c_830c_38ff2cc50292.slice/crio-8a8366a29223cc3b5c56cda5042cdb04a48c386924980ebcf9d9b1b9efd170fd WatchSource:0}: Error finding container 8a8366a29223cc3b5c56cda5042cdb04a48c386924980ebcf9d9b1b9efd170fd: Status 404 returned error can't find the container with id 8a8366a29223cc3b5c56cda5042cdb04a48c386924980ebcf9d9b1b9efd170fd Apr 16 14:56:51.056583 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:51.056547 2561 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ingress-canary/ingress-canary-vlvr9" event={"ID":"d01b24a9-f9f3-4d8c-830c-38ff2cc50292","Type":"ContainerStarted","Data":"8a8366a29223cc3b5c56cda5042cdb04a48c386924980ebcf9d9b1b9efd170fd"} Apr 16 14:56:51.057328 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:51.057305 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5lzf7" event={"ID":"18926de0-0561-424c-845b-6ea1059c821a","Type":"ContainerStarted","Data":"a7f28d82db9adf14d8a74316b9df93b33447520b7d05ce0a51ed8ad8214ec034"} Apr 16 14:56:51.210911 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:51.210879 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:51.211038 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:51.210919 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:51.215458 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:51.215436 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:52.066658 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:52.066631 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:56:52.113233 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:52.113198 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79bbb545ff-stc72"] Apr 16 14:56:53.064781 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:53.064746 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vlvr9" event={"ID":"d01b24a9-f9f3-4d8c-830c-38ff2cc50292","Type":"ContainerStarted","Data":"886f0a8e576c72d0f24f91e5618c3b2f9798e826060f50f3440c3ffbc74d0a57"} Apr 16 14:56:53.066353 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:53.066323 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5lzf7" event={"ID":"18926de0-0561-424c-845b-6ea1059c821a","Type":"ContainerStarted","Data":"1b30aef015e6f8504973f1a4be7cca863510e27091f69ffa9ef6836ae74551a5"} Apr 16 14:56:53.066472 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:53.066357 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5lzf7" event={"ID":"18926de0-0561-424c-845b-6ea1059c821a","Type":"ContainerStarted","Data":"7dbaf74c02664ccc199c16c4e31b992cf9f1d6206c1914d1814a6cce86e3b37b"} Apr 16 14:56:53.066514 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:53.066472 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5lzf7" Apr 16 14:56:53.078962 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:53.078914 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vlvr9" podStartSLOduration=251.249662636 podStartE2EDuration="4m13.078896105s" podCreationTimestamp="2026-04-16 14:52:40 +0000 UTC" firstStartedPulling="2026-04-16 14:56:50.825530934 +0000 UTC m=+283.206528525" lastFinishedPulling="2026-04-16 14:56:52.654764402 +0000 UTC m=+285.035761994" observedRunningTime="2026-04-16 14:56:53.078484719 +0000 UTC m=+285.459482344" watchObservedRunningTime="2026-04-16 14:56:53.078896105 +0000 UTC m=+285.459893719" Apr 16 14:56:53.095954 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:56:53.095908 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5lzf7" podStartSLOduration=251.243582488 podStartE2EDuration="4m13.095893422s" podCreationTimestamp="2026-04-16 14:52:40 +0000 UTC" firstStartedPulling="2026-04-16 14:56:50.798791374 +0000 UTC m=+283.179788977" lastFinishedPulling="2026-04-16 14:56:52.651102315 +0000 UTC m=+285.032099911" observedRunningTime="2026-04-16 14:56:53.094873624 +0000 UTC m=+285.475871240" watchObservedRunningTime="2026-04-16 
14:56:53.095893422 +0000 UTC m=+285.476891054" Apr 16 14:57:03.070837 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:03.070805 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5lzf7" Apr 16 14:57:08.089655 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:08.089603 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 14:57:08.090075 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:08.089603 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 14:57:08.099268 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:08.099251 2561 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:57:17.138876 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.138823 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79bbb545ff-stc72" podUID="4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" containerName="console" containerID="cri-o://fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613" gracePeriod=15 Apr 16 14:57:17.368140 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.368116 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79bbb545ff-stc72_4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef/console/0.log" Apr 16 14:57:17.368249 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.368180 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79bbb545ff-stc72" Apr 16 14:57:17.481560 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481479 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-config\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.481560 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481533 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-service-ca\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.481793 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481563 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52gc7\" (UniqueName: \"kubernetes.io/projected/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-kube-api-access-52gc7\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.481793 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481591 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-serving-cert\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.481793 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481635 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-trusted-ca-bundle\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.481793 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:57:17.481673 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-oauth-serving-cert\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.481793 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481709 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-oauth-config\") pod \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\" (UID: \"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef\") " Apr 16 14:57:17.482073 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481934 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-config" (OuterVolumeSpecName: "console-config") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:17.482073 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.481964 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:17.482253 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.482224 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:17.482293 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.482272 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:17.483777 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.483751 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:17.483887 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.483791 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-kube-api-access-52gc7" (OuterVolumeSpecName: "kube-api-access-52gc7") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "kube-api-access-52gc7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:17.483887 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.483845 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" (UID: "4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:17.583163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583130 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-oauth-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:17.583163 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583159 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-oauth-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:17.583351 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583174 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:17.583351 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583188 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-service-ca\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:17.583351 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583200 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-52gc7\" (UniqueName: 
\"kubernetes.io/projected/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-kube-api-access-52gc7\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:17.583351 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583213 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-console-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:17.583351 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:17.583226 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef-trusted-ca-bundle\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:57:18.133366 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.133334 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79bbb545ff-stc72_4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef/console/0.log" Apr 16 14:57:18.133528 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.133375 2561 generic.go:358] "Generic (PLEG): container finished" podID="4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" containerID="fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613" exitCode=2 Apr 16 14:57:18.133528 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.133460 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79bbb545ff-stc72" Apr 16 14:57:18.133528 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.133482 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bbb545ff-stc72" event={"ID":"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef","Type":"ContainerDied","Data":"fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613"} Apr 16 14:57:18.133528 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.133519 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79bbb545ff-stc72" event={"ID":"4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef","Type":"ContainerDied","Data":"76dc3e7d0f3d4c1baa8c4f9c8325f420b278aeba69e25d970c1a4706c7997614"} Apr 16 14:57:18.133702 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.133534 2561 scope.go:117] "RemoveContainer" containerID="fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613" Apr 16 14:57:18.142066 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.141921 2561 scope.go:117] "RemoveContainer" containerID="fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613" Apr 16 14:57:18.142303 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:57:18.142208 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613\": container with ID starting with fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613 not found: ID does not exist" containerID="fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613" Apr 16 14:57:18.142303 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.142247 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613"} err="failed to get container status \"fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613\": rpc error: code = 
NotFound desc = could not find container \"fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613\": container with ID starting with fdf26c1b5eaed27673ddcf4d653825abf2757b266cdebfccab11cc22f3f4b613 not found: ID does not exist" Apr 16 14:57:18.153783 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.153764 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79bbb545ff-stc72"] Apr 16 14:57:18.158014 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.157996 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79bbb545ff-stc72"] Apr 16 14:57:18.224746 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:18.224719 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" path="/var/lib/kubelet/pods/4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef/volumes" Apr 16 14:57:56.382072 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.382040 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8d9cc8cff-ncskx"] Apr 16 14:57:56.382671 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.382464 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" containerName="console" Apr 16 14:57:56.382671 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.382484 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" containerName="console" Apr 16 14:57:56.382671 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.382576 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c22f2f5-f7d8-4fb3-871e-ed91eb51d7ef" containerName="console" Apr 16 14:57:56.385682 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.385655 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.399624 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.397806 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8d9cc8cff-ncskx"] Apr 16 14:57:56.486576 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486548 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-oauth-serving-cert\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.486735 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486589 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-trusted-ca-bundle\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.486735 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486676 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-oauth-config\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.486735 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486693 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-config\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.486735 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486711 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-service-ca\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.486861 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486764 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-serving-cert\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.486861 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.486780 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7wsk\" (UniqueName: \"kubernetes.io/projected/9f696a1a-abcc-4a94-b486-d6ff4e305cda-kube-api-access-c7wsk\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587562 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587536 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-oauth-config\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587564 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-config\") pod \"console-8d9cc8cff-ncskx\" (UID: 
\"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587585 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-service-ca\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587638 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-serving-cert\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587708 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587664 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7wsk\" (UniqueName: \"kubernetes.io/projected/9f696a1a-abcc-4a94-b486-d6ff4e305cda-kube-api-access-c7wsk\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587885 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587721 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-oauth-serving-cert\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.587885 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.587767 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-trusted-ca-bundle\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.588407 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.588377 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-config\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.588539 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.588377 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-service-ca\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.588658 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.588638 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-trusted-ca-bundle\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.588726 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.588636 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-oauth-serving-cert\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.590007 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.589985 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-serving-cert\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.590100 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.590007 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-oauth-config\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.596040 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.596015 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7wsk\" (UniqueName: \"kubernetes.io/projected/9f696a1a-abcc-4a94-b486-d6ff4e305cda-kube-api-access-c7wsk\") pod \"console-8d9cc8cff-ncskx\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.696233 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.696169 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:57:56.814572 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.814545 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8d9cc8cff-ncskx"] Apr 16 14:57:56.817070 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:57:56.817040 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f696a1a_abcc_4a94_b486_d6ff4e305cda.slice/crio-b376a71398fc42c8f47f335f5fe0e28f4d1a4c2082fc3740467d99b524af5eda WatchSource:0}: Error finding container b376a71398fc42c8f47f335f5fe0e28f4d1a4c2082fc3740467d99b524af5eda: Status 404 returned error can't find the container with id b376a71398fc42c8f47f335f5fe0e28f4d1a4c2082fc3740467d99b524af5eda Apr 16 14:57:56.818881 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:56.818861 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:57:57.247322 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:57.247286 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8d9cc8cff-ncskx" event={"ID":"9f696a1a-abcc-4a94-b486-d6ff4e305cda","Type":"ContainerStarted","Data":"641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb"} Apr 16 14:57:57.247322 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:57.247324 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8d9cc8cff-ncskx" event={"ID":"9f696a1a-abcc-4a94-b486-d6ff4e305cda","Type":"ContainerStarted","Data":"b376a71398fc42c8f47f335f5fe0e28f4d1a4c2082fc3740467d99b524af5eda"} Apr 16 14:57:57.265181 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:57:57.265136 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8d9cc8cff-ncskx" podStartSLOduration=1.265121758 podStartE2EDuration="1.265121758s" podCreationTimestamp="2026-04-16 14:57:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:57.263734095 +0000 UTC m=+349.644731721" watchObservedRunningTime="2026-04-16 14:57:57.265121758 +0000 UTC m=+349.646119371" Apr 16 14:58:06.696840 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:06.696804 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:58:06.696840 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:06.696847 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:58:06.701297 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:06.701278 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:58:07.277995 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:07.277969 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 14:58:07.322620 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:07.322577 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b65f5bf46-lqnr9"] Apr 16 14:58:32.342854 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.342777 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b65f5bf46-lqnr9" podUID="15ffef42-5fa7-4d64-9d35-899db17f811e" containerName="console" containerID="cri-o://2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728" gracePeriod=15 Apr 16 14:58:32.581255 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.581232 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b65f5bf46-lqnr9_15ffef42-5fa7-4d64-9d35-899db17f811e/console/0.log" Apr 16 14:58:32.581364 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.581291 2561 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:58:32.644746 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.644668 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-service-ca\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 14:58:32.644746 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.644713 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-oauth-serving-cert\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 14:58:32.644934 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.644746 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-oauth-config\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 14:58:32.644934 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.644811 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-trusted-ca-bundle\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 14:58:32.645043 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.644940 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-console-config\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 
14:58:32.645043 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.645021 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-serving-cert\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 14:58:32.645155 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.645055 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmvvq\" (UniqueName: \"kubernetes.io/projected/15ffef42-5fa7-4d64-9d35-899db17f811e-kube-api-access-zmvvq\") pod \"15ffef42-5fa7-4d64-9d35-899db17f811e\" (UID: \"15ffef42-5fa7-4d64-9d35-899db17f811e\") " Apr 16 14:58:32.645294 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.645050 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-service-ca" (OuterVolumeSpecName: "service-ca") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:32.645359 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.645284 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-console-config" (OuterVolumeSpecName: "console-config") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:32.645359 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.645197 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:32.645359 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.645230 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:58:32.646984 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.646953 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:58:32.647094 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.646998 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ffef42-5fa7-4d64-9d35-899db17f811e-kube-api-access-zmvvq" (OuterVolumeSpecName: "kube-api-access-zmvvq") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "kube-api-access-zmvvq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:58:32.647094 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.647046 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "15ffef42-5fa7-4d64-9d35-899db17f811e" (UID: "15ffef42-5fa7-4d64-9d35-899db17f811e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:58:32.746522 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746496 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-console-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:32.746522 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746520 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:32.746695 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746534 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zmvvq\" (UniqueName: \"kubernetes.io/projected/15ffef42-5fa7-4d64-9d35-899db17f811e-kube-api-access-zmvvq\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:32.746695 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746547 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-service-ca\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:32.746695 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746561 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-oauth-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:32.746695 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746573 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/15ffef42-5fa7-4d64-9d35-899db17f811e-console-oauth-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:32.746695 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:32.746586 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15ffef42-5fa7-4d64-9d35-899db17f811e-trusted-ca-bundle\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:58:33.344822 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.344792 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b65f5bf46-lqnr9_15ffef42-5fa7-4d64-9d35-899db17f811e/console/0.log" Apr 16 14:58:33.345232 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.344832 2561 generic.go:358] "Generic (PLEG): container finished" podID="15ffef42-5fa7-4d64-9d35-899db17f811e" containerID="2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728" exitCode=2 Apr 16 14:58:33.345232 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.344924 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b65f5bf46-lqnr9" Apr 16 14:58:33.345232 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.344926 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b65f5bf46-lqnr9" event={"ID":"15ffef42-5fa7-4d64-9d35-899db17f811e","Type":"ContainerDied","Data":"2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728"} Apr 16 14:58:33.345232 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.344966 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b65f5bf46-lqnr9" event={"ID":"15ffef42-5fa7-4d64-9d35-899db17f811e","Type":"ContainerDied","Data":"d4ff8caf0bada02146b446dbf69ae415a26c9f10e5706ba59f5a2f565e3441bf"} Apr 16 14:58:33.345232 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.344982 2561 scope.go:117] "RemoveContainer" containerID="2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728" Apr 16 14:58:33.353705 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.353685 2561 scope.go:117] "RemoveContainer" containerID="2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728" Apr 16 14:58:33.353944 ip-10-0-142-86 kubenswrapper[2561]: E0416 14:58:33.353924 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728\": container with ID starting with 2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728 not found: ID does not exist" containerID="2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728" Apr 16 14:58:33.353994 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.353952 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728"} err="failed to get container status \"2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728\": rpc error: code = 
NotFound desc = could not find container \"2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728\": container with ID starting with 2a36aacf1e4a3225f6608e1acb6d2b35167da60a04f0caa2f8693b375d1bc728 not found: ID does not exist" Apr 16 14:58:33.364619 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.364583 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b65f5bf46-lqnr9"] Apr 16 14:58:33.368171 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:33.368149 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b65f5bf46-lqnr9"] Apr 16 14:58:34.225696 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:58:34.225661 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ffef42-5fa7-4d64-9d35-899db17f811e" path="/var/lib/kubelet/pods/15ffef42-5fa7-4d64-9d35-899db17f811e/volumes" Apr 16 14:59:41.630270 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.630238 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z"] Apr 16 14:59:41.630753 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.630529 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15ffef42-5fa7-4d64-9d35-899db17f811e" containerName="console" Apr 16 14:59:41.630753 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.630540 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ffef42-5fa7-4d64-9d35-899db17f811e" containerName="console" Apr 16 14:59:41.630753 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.630599 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="15ffef42-5fa7-4d64-9d35-899db17f811e" containerName="console" Apr 16 14:59:41.633696 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.633680 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.636286 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.636264 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:59:41.636403 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.636272 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:59:41.636403 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.636316 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gc788\"" Apr 16 14:59:41.641316 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.641295 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z"] Apr 16 14:59:41.744661 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.744629 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.744808 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.744687 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.744808 
ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.744750 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/c9777584-46a7-4fbf-84f0-ce58c408ebee-kube-api-access-xwfdw\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.845632 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.845584 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.845748 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.845655 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/c9777584-46a7-4fbf-84f0-ce58c408ebee-kube-api-access-xwfdw\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.845748 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.845707 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.845932 ip-10-0-142-86 
kubenswrapper[2561]: I0416 14:59:41.845912 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.846010 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.845990 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.853770 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.853751 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/c9777584-46a7-4fbf-84f0-ce58c408ebee-kube-api-access-xwfdw\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:41.943137 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:41.943083 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:42.055260 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:42.055235 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z"] Apr 16 14:59:42.057850 ip-10-0-142-86 kubenswrapper[2561]: W0416 14:59:42.057819 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9777584_46a7_4fbf_84f0_ce58c408ebee.slice/crio-ee86a490f26579e13397ea9e18e8897ac035b6d29fbbe324323db40e19c48d06 WatchSource:0}: Error finding container ee86a490f26579e13397ea9e18e8897ac035b6d29fbbe324323db40e19c48d06: Status 404 returned error can't find the container with id ee86a490f26579e13397ea9e18e8897ac035b6d29fbbe324323db40e19c48d06 Apr 16 14:59:42.536569 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:42.536537 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" event={"ID":"c9777584-46a7-4fbf-84f0-ce58c408ebee","Type":"ContainerStarted","Data":"ee86a490f26579e13397ea9e18e8897ac035b6d29fbbe324323db40e19c48d06"} Apr 16 14:59:47.552517 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:47.552484 2561 generic.go:358] "Generic (PLEG): container finished" podID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerID="6902c2a5ee5f868b230729ecea70dce0e00820617cf5b3f2b1881696c14916c0" exitCode=0 Apr 16 14:59:47.552923 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:47.552566 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" event={"ID":"c9777584-46a7-4fbf-84f0-ce58c408ebee","Type":"ContainerDied","Data":"6902c2a5ee5f868b230729ecea70dce0e00820617cf5b3f2b1881696c14916c0"} Apr 16 14:59:50.565389 ip-10-0-142-86 kubenswrapper[2561]: I0416 
14:59:50.565355 2561 generic.go:358] "Generic (PLEG): container finished" podID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerID="9ac57b37b10e856c76fe9729e15f567254a8601debdf00dbe7b84153acf1fff0" exitCode=0 Apr 16 14:59:50.565770 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:50.565438 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" event={"ID":"c9777584-46a7-4fbf-84f0-ce58c408ebee","Type":"ContainerDied","Data":"9ac57b37b10e856c76fe9729e15f567254a8601debdf00dbe7b84153acf1fff0"} Apr 16 14:59:56.587474 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:56.587442 2561 generic.go:358] "Generic (PLEG): container finished" podID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerID="af7d7ea077a060a8541a169bcbb0f29a33918a85eca77d352fd95dd23181ac37" exitCode=0 Apr 16 14:59:56.587800 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:56.587496 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" event={"ID":"c9777584-46a7-4fbf-84f0-ce58c408ebee","Type":"ContainerDied","Data":"af7d7ea077a060a8541a169bcbb0f29a33918a85eca77d352fd95dd23181ac37"} Apr 16 14:59:57.708168 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.708148 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 14:59:57.781603 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.781577 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-util\") pod \"c9777584-46a7-4fbf-84f0-ce58c408ebee\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " Apr 16 14:59:57.781754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.781669 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/c9777584-46a7-4fbf-84f0-ce58c408ebee-kube-api-access-xwfdw\") pod \"c9777584-46a7-4fbf-84f0-ce58c408ebee\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " Apr 16 14:59:57.781754 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.781736 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-bundle\") pod \"c9777584-46a7-4fbf-84f0-ce58c408ebee\" (UID: \"c9777584-46a7-4fbf-84f0-ce58c408ebee\") " Apr 16 14:59:57.782413 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.782383 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-bundle" (OuterVolumeSpecName: "bundle") pod "c9777584-46a7-4fbf-84f0-ce58c408ebee" (UID: "c9777584-46a7-4fbf-84f0-ce58c408ebee"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:59:57.783817 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.783776 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9777584-46a7-4fbf-84f0-ce58c408ebee-kube-api-access-xwfdw" (OuterVolumeSpecName: "kube-api-access-xwfdw") pod "c9777584-46a7-4fbf-84f0-ce58c408ebee" (UID: "c9777584-46a7-4fbf-84f0-ce58c408ebee"). InnerVolumeSpecName "kube-api-access-xwfdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:59:57.785715 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.785692 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-util" (OuterVolumeSpecName: "util") pod "c9777584-46a7-4fbf-84f0-ce58c408ebee" (UID: "c9777584-46a7-4fbf-84f0-ce58c408ebee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:59:57.882891 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.882834 2561 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-bundle\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:59:57.882891 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.882858 2561 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9777584-46a7-4fbf-84f0-ce58c408ebee-util\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:59:57.882891 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:57.882869 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwfdw\" (UniqueName: \"kubernetes.io/projected/c9777584-46a7-4fbf-84f0-ce58c408ebee-kube-api-access-xwfdw\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 14:59:58.600818 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:58.600781 2561 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" event={"ID":"c9777584-46a7-4fbf-84f0-ce58c408ebee","Type":"ContainerDied","Data":"ee86a490f26579e13397ea9e18e8897ac035b6d29fbbe324323db40e19c48d06"} Apr 16 14:59:58.600818 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:58.600817 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee86a490f26579e13397ea9e18e8897ac035b6d29fbbe324323db40e19c48d06" Apr 16 14:59:58.601009 ip-10-0-142-86 kubenswrapper[2561]: I0416 14:59:58.600862 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8fc8z" Apr 16 15:00:03.217447 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217412 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt"] Apr 16 15:00:03.217835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217781 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="extract" Apr 16 15:00:03.217835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217795 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="extract" Apr 16 15:00:03.217835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217814 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="pull" Apr 16 15:00:03.217835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217820 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="pull" Apr 16 15:00:03.217835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217828 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="util" Apr 16 15:00:03.217835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217833 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="util" Apr 16 15:00:03.218022 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.217886 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9777584-46a7-4fbf-84f0-ce58c408ebee" containerName="extract" Apr 16 15:00:03.269218 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.269191 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt"] Apr 16 15:00:03.269367 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.269302 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.271875 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.271853 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 15:00:03.272021 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.271905 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-dmr4f\"" Apr 16 15:00:03.272021 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.271904 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 15:00:03.272153 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.272133 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 15:00:03.322929 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.322903 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt\" (UID: \"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.323057 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.322934 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4dbj\" (UniqueName: \"kubernetes.io/projected/d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8-kube-api-access-t4dbj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt\" (UID: \"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.423507 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.423466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt\" (UID: \"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.423507 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.423506 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4dbj\" (UniqueName: \"kubernetes.io/projected/d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8-kube-api-access-t4dbj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt\" (UID: \"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.425725 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.425705 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt\" (UID: 
\"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.431563 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.431538 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4dbj\" (UniqueName: \"kubernetes.io/projected/d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8-kube-api-access-t4dbj\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt\" (UID: \"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.579895 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.579863 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:03.717060 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:03.717039 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt"] Apr 16 15:00:03.719487 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:00:03.719459 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9cd0d3e_9181_42e0_a7dd_b0566e3c7bb8.slice/crio-29050dc8a12b2f1be6ca0392e03e3f47a83f9f6d490e02de040e62f5c88c37fd WatchSource:0}: Error finding container 29050dc8a12b2f1be6ca0392e03e3f47a83f9f6d490e02de040e62f5c88c37fd: Status 404 returned error can't find the container with id 29050dc8a12b2f1be6ca0392e03e3f47a83f9f6d490e02de040e62f5c88c37fd Apr 16 15:00:04.619107 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:04.619069 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" event={"ID":"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8","Type":"ContainerStarted","Data":"29050dc8a12b2f1be6ca0392e03e3f47a83f9f6d490e02de040e62f5c88c37fd"} Apr 16 15:00:09.289677 ip-10-0-142-86 kubenswrapper[2561]: I0416 
15:00:09.289645 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9lwmx"] Apr 16 15:00:09.307269 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.307227 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9lwmx"] Apr 16 15:00:09.307428 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.307324 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.309890 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.309857 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fcsr5\"" Apr 16 15:00:09.309890 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.309857 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 15:00:09.310090 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.309863 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 15:00:09.372411 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.372390 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhbt4\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-kube-api-access-qhbt4\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.372518 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.372416 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " 
pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.372518 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.372475 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-cabundle0\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.473410 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.473377 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-cabundle0\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.473576 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.473444 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhbt4\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-kube-api-access-qhbt4\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.473576 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.473471 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.473733 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.473578 2561 secret.go:281] references non-existent secret key: ca.crt Apr 16 15:00:09.473733 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.473593 2561 projected.go:277] Couldn't get secret payload 
openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 15:00:09.473733 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.473603 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9lwmx: references non-existent secret key: ca.crt Apr 16 15:00:09.473733 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.473690 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates podName:d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:09.973668162 +0000 UTC m=+482.354665761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates") pod "keda-operator-ffbb595cb-9lwmx" (UID: "d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9") : references non-existent secret key: ca.crt Apr 16 15:00:09.474087 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.474068 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-cabundle0\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.483536 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.483508 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhbt4\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-kube-api-access-qhbt4\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:09.638447 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.638357 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" event={"ID":"d9cd0d3e-9181-42e0-a7dd-b0566e3c7bb8","Type":"ContainerStarted","Data":"cc4da9de5efadc7135308058774130e9ef64dff50c4df15738912c8a9a6d6683"} Apr 16 15:00:09.638593 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.638454 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:09.656450 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.656390 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" podStartSLOduration=1.6517188040000002 podStartE2EDuration="6.6563759s" podCreationTimestamp="2026-04-16 15:00:03 +0000 UTC" firstStartedPulling="2026-04-16 15:00:03.721078377 +0000 UTC m=+476.102075972" lastFinishedPulling="2026-04-16 15:00:08.725735473 +0000 UTC m=+481.106733068" observedRunningTime="2026-04-16 15:00:09.655157118 +0000 UTC m=+482.036154749" watchObservedRunningTime="2026-04-16 15:00:09.6563759 +0000 UTC m=+482.037373514" Apr 16 15:00:09.794292 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.794257 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-cg6sg"] Apr 16 15:00:09.813808 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.813784 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-cg6sg"] Apr 16 15:00:09.813924 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.813886 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:09.816453 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.816433 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 16 15:00:09.876355 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.876317 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwgwz\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-kube-api-access-jwgwz\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:09.876494 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.876439 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-certificates\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:09.977856 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.977764 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-certificates\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:09.977856 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.977836 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 
15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.977892 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwgwz\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-kube-api-access-jwgwz\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.977935 2561 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.977975 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-cg6sg: secret "keda-admission-webhooks-certs" not found Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.978036 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-certificates podName:5532461e-931f-45a2-8c27-d9d4ec80d1f0 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:10.478016158 +0000 UTC m=+482.859013755 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-certificates") pod "keda-admission-cf49989db-cg6sg" (UID: "5532461e-931f-45a2-8c27-d9d4ec80d1f0") : secret "keda-admission-webhooks-certs" not found Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.977940 2561 secret.go:281] references non-existent secret key: ca.crt Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.978054 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.978063 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9lwmx: references non-existent secret key: ca.crt Apr 16 15:00:09.978087 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:09.978091 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates podName:d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:10.978082362 +0000 UTC m=+483.359079961 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates") pod "keda-operator-ffbb595cb-9lwmx" (UID: "d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9") : references non-existent secret key: ca.crt Apr 16 15:00:09.989021 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:09.988990 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwgwz\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-kube-api-access-jwgwz\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:10.482543 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:10.482510 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-certificates\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:10.484939 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:10.484914 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5532461e-931f-45a2-8c27-d9d4ec80d1f0-certificates\") pod \"keda-admission-cf49989db-cg6sg\" (UID: \"5532461e-931f-45a2-8c27-d9d4ec80d1f0\") " pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:10.727386 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:10.727354 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:10.849849 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:10.849823 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-cg6sg"] Apr 16 15:00:10.852871 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:00:10.852836 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5532461e_931f_45a2_8c27_d9d4ec80d1f0.slice/crio-ea29c505effbda25b10abb9f73c1c443406b6a1a2bc6ca0c2dbeaa51fb8edb0d WatchSource:0}: Error finding container ea29c505effbda25b10abb9f73c1c443406b6a1a2bc6ca0c2dbeaa51fb8edb0d: Status 404 returned error can't find the container with id ea29c505effbda25b10abb9f73c1c443406b6a1a2bc6ca0c2dbeaa51fb8edb0d Apr 16 15:00:10.987715 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:10.987637 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:10.987845 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:10.987786 2561 secret.go:281] references non-existent secret key: ca.crt Apr 16 15:00:10.987845 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:10.987800 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 15:00:10.987845 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:10.987809 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9lwmx: references non-existent secret key: ca.crt Apr 16 15:00:10.987944 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:10.987861 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates podName:d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:12.987845132 +0000 UTC m=+485.368842724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates") pod "keda-operator-ffbb595cb-9lwmx" (UID: "d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9") : references non-existent secret key: ca.crt Apr 16 15:00:11.645578 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:11.645532 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-cg6sg" event={"ID":"5532461e-931f-45a2-8c27-d9d4ec80d1f0","Type":"ContainerStarted","Data":"ea29c505effbda25b10abb9f73c1c443406b6a1a2bc6ca0c2dbeaa51fb8edb0d"} Apr 16 15:00:12.650156 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:12.650055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-cg6sg" event={"ID":"5532461e-931f-45a2-8c27-d9d4ec80d1f0","Type":"ContainerStarted","Data":"9a355aa33e51827827778f18d67b221b5a5c75ddb4359dfba891bfb566209c10"} Apr 16 15:00:12.650156 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:12.650135 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:12.665031 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:12.664982 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-cg6sg" podStartSLOduration=2.197201611 podStartE2EDuration="3.664970227s" podCreationTimestamp="2026-04-16 15:00:09 +0000 UTC" firstStartedPulling="2026-04-16 15:00:10.854536548 +0000 UTC m=+483.235534144" lastFinishedPulling="2026-04-16 15:00:12.322305153 +0000 UTC m=+484.703302760" observedRunningTime="2026-04-16 15:00:12.663801408 +0000 UTC m=+485.044799023" 
watchObservedRunningTime="2026-04-16 15:00:12.664970227 +0000 UTC m=+485.045967841" Apr 16 15:00:13.005510 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:13.005428 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:13.005687 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:13.005573 2561 secret.go:281] references non-existent secret key: ca.crt Apr 16 15:00:13.005687 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:13.005588 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 15:00:13.005687 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:13.005597 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-9lwmx: references non-existent secret key: ca.crt Apr 16 15:00:13.005687 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:00:13.005668 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates podName:d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9 nodeName:}" failed. No retries permitted until 2026-04-16 15:00:17.005652651 +0000 UTC m=+489.386650247 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates") pod "keda-operator-ffbb595cb-9lwmx" (UID: "d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9") : references non-existent secret key: ca.crt Apr 16 15:00:17.037662 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:17.037623 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:17.039921 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:17.039899 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9-certificates\") pod \"keda-operator-ffbb595cb-9lwmx\" (UID: \"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9\") " pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:17.117243 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:17.117214 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:17.236882 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:17.236851 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-9lwmx"] Apr 16 15:00:17.240317 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:00:17.240289 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0399f5f_1002_4a6d_9fd8_698dbd4fb0c9.slice/crio-2633869bf8658a99b49b36772ea9a22c3ca556960de0abc72f79cad5385f2f2b WatchSource:0}: Error finding container 2633869bf8658a99b49b36772ea9a22c3ca556960de0abc72f79cad5385f2f2b: Status 404 returned error can't find the container with id 2633869bf8658a99b49b36772ea9a22c3ca556960de0abc72f79cad5385f2f2b Apr 16 15:00:17.666263 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:17.666219 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" event={"ID":"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9","Type":"ContainerStarted","Data":"2633869bf8658a99b49b36772ea9a22c3ca556960de0abc72f79cad5385f2f2b"} Apr 16 15:00:20.677909 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:20.677824 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" event={"ID":"d0399f5f-1002-4a6d-9fd8-698dbd4fb0c9","Type":"ContainerStarted","Data":"0c9a0ea3441879e16c302820065a9d891f58ea326fa2d8b3c326ecf1cf01543a"} Apr 16 15:00:20.678348 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:20.677973 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:00:20.693987 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:20.693944 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" podStartSLOduration=8.588618406 podStartE2EDuration="11.693933166s" 
podCreationTimestamp="2026-04-16 15:00:09 +0000 UTC" firstStartedPulling="2026-04-16 15:00:17.241601646 +0000 UTC m=+489.622599239" lastFinishedPulling="2026-04-16 15:00:20.346916404 +0000 UTC m=+492.727913999" observedRunningTime="2026-04-16 15:00:20.69251257 +0000 UTC m=+493.073510184" watchObservedRunningTime="2026-04-16 15:00:20.693933166 +0000 UTC m=+493.074930780" Apr 16 15:00:30.643391 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:30.643356 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-fx5wt" Apr 16 15:00:33.655460 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:33.655425 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-cg6sg" Apr 16 15:00:41.683319 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:00:41.683283 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-9lwmx" Apr 16 15:01:14.532240 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.532166 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-f7whx"] Apr 16 15:01:14.535561 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.535537 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:14.537878 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.537861 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:01:14.538936 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.538918 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 15:01:14.539040 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.538948 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-2qftb\"" Apr 16 15:01:14.539040 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.538948 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:01:14.545469 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.545451 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-nlz29"] Apr 16 15:01:14.548895 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.548879 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-f7whx"] Apr 16 15:01:14.548998 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.548987 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:14.551386 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.551368 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 15:01:14.551484 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.551419 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-cvgf5\"" Apr 16 15:01:14.555770 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.555748 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-nlz29"] Apr 16 15:01:14.581522 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.581495 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:14.581682 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.581543 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vlh\" (UniqueName: \"kubernetes.io/projected/1949308f-8620-489a-9c56-3c217eac6bb2-kube-api-access-c7vlh\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:14.581682 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.581655 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l72lf\" (UniqueName: \"kubernetes.io/projected/7a3cbf79-de66-4c90-8568-5d8392101c7e-kube-api-access-l72lf\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:14.581787 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.581728 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3cbf79-de66-4c90-8568-5d8392101c7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:14.682161 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.682136 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l72lf\" (UniqueName: \"kubernetes.io/projected/7a3cbf79-de66-4c90-8568-5d8392101c7e-kube-api-access-l72lf\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:14.682295 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.682193 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3cbf79-de66-4c90-8568-5d8392101c7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:14.682295 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.682263 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:14.682414 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.682297 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vlh\" (UniqueName: 
\"kubernetes.io/projected/1949308f-8620-489a-9c56-3c217eac6bb2-kube-api-access-c7vlh\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:14.682414 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:01:14.682354 2561 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 15:01:14.682509 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:01:14.682426 2561 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 15:01:14.682509 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:01:14.682431 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3cbf79-de66-4c90-8568-5d8392101c7e-cert podName:7a3cbf79-de66-4c90-8568-5d8392101c7e nodeName:}" failed. No retries permitted until 2026-04-16 15:01:15.182410159 +0000 UTC m=+547.563407755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a3cbf79-de66-4c90-8568-5d8392101c7e-cert") pod "llmisvc-controller-manager-68cc5db7c4-nlz29" (UID: "7a3cbf79-de66-4c90-8568-5d8392101c7e") : secret "llmisvc-webhook-server-cert" not found Apr 16 15:01:14.682509 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:01:14.682502 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert podName:1949308f-8620-489a-9c56-3c217eac6bb2 nodeName:}" failed. No retries permitted until 2026-04-16 15:01:15.182484408 +0000 UTC m=+547.563482003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert") pod "kserve-controller-manager-7669bdc57-f7whx" (UID: "1949308f-8620-489a-9c56-3c217eac6bb2") : secret "kserve-webhook-server-cert" not found Apr 16 15:01:14.692852 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.692831 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l72lf\" (UniqueName: \"kubernetes.io/projected/7a3cbf79-de66-4c90-8568-5d8392101c7e-kube-api-access-l72lf\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:14.697132 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:14.697108 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vlh\" (UniqueName: \"kubernetes.io/projected/1949308f-8620-489a-9c56-3c217eac6bb2-kube-api-access-c7vlh\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:15.186323 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.186288 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3cbf79-de66-4c90-8568-5d8392101c7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:15.186480 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.186342 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:15.188676 
ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.188649 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert\") pod \"kserve-controller-manager-7669bdc57-f7whx\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:15.188676 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.188658 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3cbf79-de66-4c90-8568-5d8392101c7e-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-nlz29\" (UID: \"7a3cbf79-de66-4c90-8568-5d8392101c7e\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:15.448482 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.448402 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:15.463109 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.463084 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:15.575587 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.575557 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-f7whx"] Apr 16 15:01:15.578233 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:01:15.578193 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1949308f_8620_489a_9c56_3c217eac6bb2.slice/crio-ab85faa23cf08674afc9bb63953fcfbb6f90fe9a1f37848ba7deed567a37f9b9 WatchSource:0}: Error finding container ab85faa23cf08674afc9bb63953fcfbb6f90fe9a1f37848ba7deed567a37f9b9: Status 404 returned error can't find the container with id ab85faa23cf08674afc9bb63953fcfbb6f90fe9a1f37848ba7deed567a37f9b9 Apr 16 15:01:15.599701 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.599640 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-nlz29"] Apr 16 15:01:15.601533 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:01:15.601512 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7a3cbf79_de66_4c90_8568_5d8392101c7e.slice/crio-e2200cf9a61c1dde62aa7955f2ae94b42663bf198fe5ca102dfbfbb04081ee60 WatchSource:0}: Error finding container e2200cf9a61c1dde62aa7955f2ae94b42663bf198fe5ca102dfbfbb04081ee60: Status 404 returned error can't find the container with id e2200cf9a61c1dde62aa7955f2ae94b42663bf198fe5ca102dfbfbb04081ee60 Apr 16 15:01:15.871400 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.871368 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" event={"ID":"7a3cbf79-de66-4c90-8568-5d8392101c7e","Type":"ContainerStarted","Data":"e2200cf9a61c1dde62aa7955f2ae94b42663bf198fe5ca102dfbfbb04081ee60"} Apr 16 15:01:15.872274 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:15.872252 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" event={"ID":"1949308f-8620-489a-9c56-3c217eac6bb2","Type":"ContainerStarted","Data":"ab85faa23cf08674afc9bb63953fcfbb6f90fe9a1f37848ba7deed567a37f9b9"} Apr 16 15:01:19.888150 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:19.888113 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" event={"ID":"1949308f-8620-489a-9c56-3c217eac6bb2","Type":"ContainerStarted","Data":"e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140"} Apr 16 15:01:19.888523 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:19.888239 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:19.904045 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:19.904000 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" podStartSLOduration=2.538693632 podStartE2EDuration="5.903989156s" podCreationTimestamp="2026-04-16 15:01:14 +0000 UTC" firstStartedPulling="2026-04-16 15:01:15.579581539 +0000 UTC m=+547.960579135" lastFinishedPulling="2026-04-16 15:01:18.944877064 +0000 UTC m=+551.325874659" observedRunningTime="2026-04-16 15:01:19.90125013 +0000 UTC m=+552.282247746" watchObservedRunningTime="2026-04-16 15:01:19.903989156 +0000 UTC m=+552.284986769" Apr 16 15:01:32.932838 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:32.932760 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" event={"ID":"7a3cbf79-de66-4c90-8568-5d8392101c7e","Type":"ContainerStarted","Data":"65c253ec1cf831e8ef8d372315642bf4bf2475f08a2b05da9488b7681f707809"} Apr 16 15:01:32.932838 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:32.932840 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:01:32.950262 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:32.950194 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" podStartSLOduration=1.884881313 podStartE2EDuration="18.95018341s" podCreationTimestamp="2026-04-16 15:01:14 +0000 UTC" firstStartedPulling="2026-04-16 15:01:15.602859556 +0000 UTC m=+547.983857148" lastFinishedPulling="2026-04-16 15:01:32.668161636 +0000 UTC m=+565.049159245" observedRunningTime="2026-04-16 15:01:32.947622989 +0000 UTC m=+565.328620593" watchObservedRunningTime="2026-04-16 15:01:32.95018341 +0000 UTC m=+565.331181024" Apr 16 15:01:48.684996 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.684965 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-598cbdf847-tvt9d"] Apr 16 15:01:48.688815 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.688789 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.701238 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.701216 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-598cbdf847-tvt9d"] Apr 16 15:01:48.855315 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855278 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-oauth-serving-cert\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.855315 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855317 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f059092-ee05-42ba-90c4-6943c75e7795-console-serving-cert\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.855546 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855341 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-service-ca\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.855546 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855391 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvxqr\" (UniqueName: \"kubernetes.io/projected/7f059092-ee05-42ba-90c4-6943c75e7795-kube-api-access-gvxqr\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 
15:01:48.855546 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855448 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-trusted-ca-bundle\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.855546 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855492 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-console-config\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.855546 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.855526 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f059092-ee05-42ba-90c4-6943c75e7795-console-oauth-config\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.956892 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.956829 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f059092-ee05-42ba-90c4-6943c75e7795-console-oauth-config\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.956892 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.956868 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-oauth-serving-cert\") pod 
\"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.956892 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.956888 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f059092-ee05-42ba-90c4-6943c75e7795-console-serving-cert\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957121 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.956913 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-service-ca\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957121 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.956931 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvxqr\" (UniqueName: \"kubernetes.io/projected/7f059092-ee05-42ba-90c4-6943c75e7795-kube-api-access-gvxqr\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957121 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.956963 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-trusted-ca-bundle\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957121 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.957070 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-console-config\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957743 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.957721 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-console-config\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957743 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.957735 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-service-ca\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957892 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.957724 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-oauth-serving-cert\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.957892 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.957827 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f059092-ee05-42ba-90c4-6943c75e7795-trusted-ca-bundle\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.959371 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.959343 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7f059092-ee05-42ba-90c4-6943c75e7795-console-serving-cert\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.959452 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.959369 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f059092-ee05-42ba-90c4-6943c75e7795-console-oauth-config\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.964835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.964818 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvxqr\" (UniqueName: \"kubernetes.io/projected/7f059092-ee05-42ba-90c4-6943c75e7795-kube-api-access-gvxqr\") pod \"console-598cbdf847-tvt9d\" (UID: \"7f059092-ee05-42ba-90c4-6943c75e7795\") " pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:48.998218 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:48.998198 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:49.322337 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:49.322293 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-598cbdf847-tvt9d"] Apr 16 15:01:49.324520 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:01:49.324493 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f059092_ee05_42ba_90c4_6943c75e7795.slice/crio-8fc955da0b979d54cf15e4a3e56aca01f669864193cfc74346e872ec3a291e88 WatchSource:0}: Error finding container 8fc955da0b979d54cf15e4a3e56aca01f669864193cfc74346e872ec3a291e88: Status 404 returned error can't find the container with id 8fc955da0b979d54cf15e4a3e56aca01f669864193cfc74346e872ec3a291e88 Apr 16 15:01:49.989403 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:49.989368 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-598cbdf847-tvt9d" event={"ID":"7f059092-ee05-42ba-90c4-6943c75e7795","Type":"ContainerStarted","Data":"1d548bd7a60b33541d0e480e15a2a14d98266aab90bacef03d4e3b7814f5bc5a"} Apr 16 15:01:49.989795 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:49.989409 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-598cbdf847-tvt9d" event={"ID":"7f059092-ee05-42ba-90c4-6943c75e7795","Type":"ContainerStarted","Data":"8fc955da0b979d54cf15e4a3e56aca01f669864193cfc74346e872ec3a291e88"} Apr 16 15:01:50.006903 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:50.006856 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-598cbdf847-tvt9d" podStartSLOduration=2.00684245 podStartE2EDuration="2.00684245s" podCreationTimestamp="2026-04-16 15:01:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:01:50.005733864 +0000 UTC m=+582.386731481" 
watchObservedRunningTime="2026-04-16 15:01:50.00684245 +0000 UTC m=+582.387840065" Apr 16 15:01:50.897268 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:50.897238 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:01:58.998867 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:58.998823 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:58.998867 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:58.998874 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:59.003600 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:59.003578 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:59.022352 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:59.022326 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-598cbdf847-tvt9d" Apr 16 15:01:59.063220 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:01:59.063191 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8d9cc8cff-ncskx"] Apr 16 15:02:03.938835 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:03.938804 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-nlz29" Apr 16 15:02:05.229802 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.229771 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-f7whx"] Apr 16 15:02:05.230268 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.229985 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" podUID="1949308f-8620-489a-9c56-3c217eac6bb2" containerName="manager" 
containerID="cri-o://e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140" gracePeriod=10 Apr 16 15:02:05.251286 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.251259 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-9nc5f"] Apr 16 15:02:05.253320 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.253304 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.263018 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.262992 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-9nc5f"] Apr 16 15:02:05.277851 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.277824 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn6v4\" (UniqueName: \"kubernetes.io/projected/9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9-kube-api-access-mn6v4\") pod \"kserve-controller-manager-7669bdc57-9nc5f\" (UID: \"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9\") " pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.277957 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.277911 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9-cert\") pod \"kserve-controller-manager-7669bdc57-9nc5f\" (UID: \"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9\") " pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.379372 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.379339 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9-cert\") pod \"kserve-controller-manager-7669bdc57-9nc5f\" (UID: \"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9\") " pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 
15:02:05.379498 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.379440 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn6v4\" (UniqueName: \"kubernetes.io/projected/9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9-kube-api-access-mn6v4\") pod \"kserve-controller-manager-7669bdc57-9nc5f\" (UID: \"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9\") " pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.381918 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.381889 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9-cert\") pod \"kserve-controller-manager-7669bdc57-9nc5f\" (UID: \"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9\") " pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.388077 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.388051 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn6v4\" (UniqueName: \"kubernetes.io/projected/9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9-kube-api-access-mn6v4\") pod \"kserve-controller-manager-7669bdc57-9nc5f\" (UID: \"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9\") " pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.468289 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.468267 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:02:05.480386 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.480338 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vlh\" (UniqueName: \"kubernetes.io/projected/1949308f-8620-489a-9c56-3c217eac6bb2-kube-api-access-c7vlh\") pod \"1949308f-8620-489a-9c56-3c217eac6bb2\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " Apr 16 15:02:05.480471 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.480397 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert\") pod \"1949308f-8620-489a-9c56-3c217eac6bb2\" (UID: \"1949308f-8620-489a-9c56-3c217eac6bb2\") " Apr 16 15:02:05.482499 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.482454 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1949308f-8620-489a-9c56-3c217eac6bb2-kube-api-access-c7vlh" (OuterVolumeSpecName: "kube-api-access-c7vlh") pod "1949308f-8620-489a-9c56-3c217eac6bb2" (UID: "1949308f-8620-489a-9c56-3c217eac6bb2"). InnerVolumeSpecName "kube-api-access-c7vlh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:02:05.482632 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.482502 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert" (OuterVolumeSpecName: "cert") pod "1949308f-8620-489a-9c56-3c217eac6bb2" (UID: "1949308f-8620-489a-9c56-3c217eac6bb2"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:02:05.581178 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.581143 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7vlh\" (UniqueName: \"kubernetes.io/projected/1949308f-8620-489a-9c56-3c217eac6bb2-kube-api-access-c7vlh\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:05.581178 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.581170 2561 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1949308f-8620-489a-9c56-3c217eac6bb2-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:05.616119 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.616091 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:05.732451 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:05.732401 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-9nc5f"] Apr 16 15:02:05.734682 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:02:05.734657 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b557b33_e79d_4ed8_8dcb_3a2cfaa263b9.slice/crio-7fd33072ee743ad5a768e3169ab47396a03bff11368ef8ff873b687ed55ae96a WatchSource:0}: Error finding container 7fd33072ee743ad5a768e3169ab47396a03bff11368ef8ff873b687ed55ae96a: Status 404 returned error can't find the container with id 7fd33072ee743ad5a768e3169ab47396a03bff11368ef8ff873b687ed55ae96a Apr 16 15:02:06.048007 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.047973 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" event={"ID":"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9","Type":"ContainerStarted","Data":"7fd33072ee743ad5a768e3169ab47396a03bff11368ef8ff873b687ed55ae96a"} Apr 16 15:02:06.048964 
ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.048937 2561 generic.go:358] "Generic (PLEG): container finished" podID="1949308f-8620-489a-9c56-3c217eac6bb2" containerID="e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140" exitCode=0 Apr 16 15:02:06.049148 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.048998 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" Apr 16 15:02:06.049148 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.049015 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" event={"ID":"1949308f-8620-489a-9c56-3c217eac6bb2","Type":"ContainerDied","Data":"e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140"} Apr 16 15:02:06.049148 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.049045 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-f7whx" event={"ID":"1949308f-8620-489a-9c56-3c217eac6bb2","Type":"ContainerDied","Data":"ab85faa23cf08674afc9bb63953fcfbb6f90fe9a1f37848ba7deed567a37f9b9"} Apr 16 15:02:06.049148 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.049069 2561 scope.go:117] "RemoveContainer" containerID="e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140" Apr 16 15:02:06.057374 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.057356 2561 scope.go:117] "RemoveContainer" containerID="e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140" Apr 16 15:02:06.057633 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:02:06.057599 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140\": container with ID starting with e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140 not found: ID does not exist" 
containerID="e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140" Apr 16 15:02:06.057676 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.057642 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140"} err="failed to get container status \"e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140\": rpc error: code = NotFound desc = could not find container \"e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140\": container with ID starting with e40b1854c574f9f2f21e4e25cd9d735a3d8b34d4ddb95f8b717daf4cefe13140 not found: ID does not exist" Apr 16 15:02:06.068170 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.068150 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-f7whx"] Apr 16 15:02:06.071200 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.071180 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-f7whx"] Apr 16 15:02:06.225625 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:06.225586 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1949308f-8620-489a-9c56-3c217eac6bb2" path="/var/lib/kubelet/pods/1949308f-8620-489a-9c56-3c217eac6bb2/volumes" Apr 16 15:02:07.053448 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:07.053413 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" event={"ID":"9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9","Type":"ContainerStarted","Data":"b0962f0b6352f022e1469ebfec59f85b148b797200676b80bf12fe64e2161ed1"} Apr 16 15:02:07.053902 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:07.053479 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:07.067850 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:07.067806 2561 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" podStartSLOduration=1.60685147 podStartE2EDuration="2.067792468s" podCreationTimestamp="2026-04-16 15:02:05 +0000 UTC" firstStartedPulling="2026-04-16 15:02:05.736142705 +0000 UTC m=+598.117140297" lastFinishedPulling="2026-04-16 15:02:06.1970837 +0000 UTC m=+598.578081295" observedRunningTime="2026-04-16 15:02:07.067295806 +0000 UTC m=+599.448293421" watchObservedRunningTime="2026-04-16 15:02:07.067792468 +0000 UTC m=+599.448790081" Apr 16 15:02:08.115556 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:08.115524 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:02:08.116284 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:08.116261 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:02:24.088465 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.088430 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8d9cc8cff-ncskx" podUID="9f696a1a-abcc-4a94-b486-d6ff4e305cda" containerName="console" containerID="cri-o://641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb" gracePeriod=15 Apr 16 15:02:24.327255 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.327234 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8d9cc8cff-ncskx_9f696a1a-abcc-4a94-b486-d6ff4e305cda/console/0.log" Apr 16 15:02:24.327363 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.327294 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 15:02:24.425931 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.425856 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-serving-cert\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 15:02:24.425931 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.425919 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-oauth-serving-cert\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 15:02:24.426126 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.425967 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-trusted-ca-bundle\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 15:02:24.426126 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.425992 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7wsk\" (UniqueName: \"kubernetes.io/projected/9f696a1a-abcc-4a94-b486-d6ff4e305cda-kube-api-access-c7wsk\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 15:02:24.426126 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426052 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-oauth-config\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 
15:02:24.426297 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426196 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-config\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 15:02:24.426297 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426285 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-service-ca\") pod \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\" (UID: \"9f696a1a-abcc-4a94-b486-d6ff4e305cda\") " Apr 16 15:02:24.426399 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426381 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:24.426817 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426535 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:24.426817 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426764 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-config" (OuterVolumeSpecName: "console-config") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:24.426817 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426779 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-service-ca" (OuterVolumeSpecName: "service-ca") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:02:24.426817 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426802 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-trusted-ca-bundle\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:24.427104 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.426825 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-oauth-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:24.428262 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.428233 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:02:24.428365 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.428239 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f696a1a-abcc-4a94-b486-d6ff4e305cda-kube-api-access-c7wsk" (OuterVolumeSpecName: "kube-api-access-c7wsk") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). InnerVolumeSpecName "kube-api-access-c7wsk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:02:24.428427 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.428409 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9f696a1a-abcc-4a94-b486-d6ff4e305cda" (UID: "9f696a1a-abcc-4a94-b486-d6ff4e305cda"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:02:24.527547 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.527517 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-oauth-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:24.527547 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.527543 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-config\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:24.527742 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.527553 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9f696a1a-abcc-4a94-b486-d6ff4e305cda-service-ca\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 
15:02:24.527742 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.527562 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9f696a1a-abcc-4a94-b486-d6ff4e305cda-console-serving-cert\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:24.527742 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:24.527572 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7wsk\" (UniqueName: \"kubernetes.io/projected/9f696a1a-abcc-4a94-b486-d6ff4e305cda-kube-api-access-c7wsk\") on node \"ip-10-0-142-86.ec2.internal\" DevicePath \"\"" Apr 16 15:02:25.108903 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.108876 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8d9cc8cff-ncskx_9f696a1a-abcc-4a94-b486-d6ff4e305cda/console/0.log" Apr 16 15:02:25.109311 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.108919 2561 generic.go:358] "Generic (PLEG): container finished" podID="9f696a1a-abcc-4a94-b486-d6ff4e305cda" containerID="641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb" exitCode=2 Apr 16 15:02:25.109311 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.108971 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8d9cc8cff-ncskx" event={"ID":"9f696a1a-abcc-4a94-b486-d6ff4e305cda","Type":"ContainerDied","Data":"641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb"} Apr 16 15:02:25.109311 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.108986 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8d9cc8cff-ncskx" Apr 16 15:02:25.109311 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.108997 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8d9cc8cff-ncskx" event={"ID":"9f696a1a-abcc-4a94-b486-d6ff4e305cda","Type":"ContainerDied","Data":"b376a71398fc42c8f47f335f5fe0e28f4d1a4c2082fc3740467d99b524af5eda"} Apr 16 15:02:25.109311 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.109015 2561 scope.go:117] "RemoveContainer" containerID="641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb" Apr 16 15:02:25.118296 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.118219 2561 scope.go:117] "RemoveContainer" containerID="641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb" Apr 16 15:02:25.118972 ip-10-0-142-86 kubenswrapper[2561]: E0416 15:02:25.118909 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb\": container with ID starting with 641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb not found: ID does not exist" containerID="641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb" Apr 16 15:02:25.118972 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.118942 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb"} err="failed to get container status \"641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb\": rpc error: code = NotFound desc = could not find container \"641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb\": container with ID starting with 641beece0bd0c49d3797f7c977522e08c65f1c55fd178f845d778deb64dff8eb not found: ID does not exist" Apr 16 15:02:25.131758 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.131736 2561 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-8d9cc8cff-ncskx"] Apr 16 15:02:25.135701 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:25.135677 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8d9cc8cff-ncskx"] Apr 16 15:02:26.225742 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:26.225708 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f696a1a-abcc-4a94-b486-d6ff4e305cda" path="/var/lib/kubelet/pods/9f696a1a-abcc-4a94-b486-d6ff4e305cda/volumes" Apr 16 15:02:38.062210 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.062181 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-9nc5f" Apr 16 15:02:38.932975 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.932917 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-kcdp4"] Apr 16 15:02:38.935778 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.934775 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1949308f-8620-489a-9c56-3c217eac6bb2" containerName="manager" Apr 16 15:02:38.935778 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.934802 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="1949308f-8620-489a-9c56-3c217eac6bb2" containerName="manager" Apr 16 15:02:38.935778 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.934858 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f696a1a-abcc-4a94-b486-d6ff4e305cda" containerName="console" Apr 16 15:02:38.935778 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.934867 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f696a1a-abcc-4a94-b486-d6ff4e305cda" containerName="console" Apr 16 15:02:38.935778 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.935052 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="1949308f-8620-489a-9c56-3c217eac6bb2" 
containerName="manager" Apr 16 15:02:38.935778 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.935067 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f696a1a-abcc-4a94-b486-d6ff4e305cda" containerName="console" Apr 16 15:02:38.939687 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.938261 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:38.942516 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.942428 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 15:02:38.942945 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.942704 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-66cws\"" Apr 16 15:02:38.944847 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:38.944825 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kcdp4"] Apr 16 15:02:39.038945 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.038922 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jz6b\" (UniqueName: \"kubernetes.io/projected/05ed9814-601f-4e12-b92a-5bfa7caf44d1-kube-api-access-8jz6b\") pod \"odh-model-controller-696fc77849-kcdp4\" (UID: \"05ed9814-601f-4e12-b92a-5bfa7caf44d1\") " pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.039080 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.038971 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ed9814-601f-4e12-b92a-5bfa7caf44d1-cert\") pod \"odh-model-controller-696fc77849-kcdp4\" (UID: \"05ed9814-601f-4e12-b92a-5bfa7caf44d1\") " pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.139864 ip-10-0-142-86 kubenswrapper[2561]: I0416 
15:02:39.139837 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jz6b\" (UniqueName: \"kubernetes.io/projected/05ed9814-601f-4e12-b92a-5bfa7caf44d1-kube-api-access-8jz6b\") pod \"odh-model-controller-696fc77849-kcdp4\" (UID: \"05ed9814-601f-4e12-b92a-5bfa7caf44d1\") " pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.140200 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.139876 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ed9814-601f-4e12-b92a-5bfa7caf44d1-cert\") pod \"odh-model-controller-696fc77849-kcdp4\" (UID: \"05ed9814-601f-4e12-b92a-5bfa7caf44d1\") " pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.142128 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.142103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05ed9814-601f-4e12-b92a-5bfa7caf44d1-cert\") pod \"odh-model-controller-696fc77849-kcdp4\" (UID: \"05ed9814-601f-4e12-b92a-5bfa7caf44d1\") " pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.149715 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.149695 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jz6b\" (UniqueName: \"kubernetes.io/projected/05ed9814-601f-4e12-b92a-5bfa7caf44d1-kube-api-access-8jz6b\") pod \"odh-model-controller-696fc77849-kcdp4\" (UID: \"05ed9814-601f-4e12-b92a-5bfa7caf44d1\") " pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.251171 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.251109 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:39.368243 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:39.368222 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-kcdp4"] Apr 16 15:02:39.370787 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:02:39.370756 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ed9814_601f_4e12_b92a_5bfa7caf44d1.slice/crio-87f0ff0a5833e765224e6280079910d180b991368945bf552aa7253e0db95fee WatchSource:0}: Error finding container 87f0ff0a5833e765224e6280079910d180b991368945bf552aa7253e0db95fee: Status 404 returned error can't find the container with id 87f0ff0a5833e765224e6280079910d180b991368945bf552aa7253e0db95fee Apr 16 15:02:40.163224 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:40.163173 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kcdp4" event={"ID":"05ed9814-601f-4e12-b92a-5bfa7caf44d1","Type":"ContainerStarted","Data":"87f0ff0a5833e765224e6280079910d180b991368945bf552aa7253e0db95fee"} Apr 16 15:02:42.171788 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:42.171700 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-kcdp4" event={"ID":"05ed9814-601f-4e12-b92a-5bfa7caf44d1","Type":"ContainerStarted","Data":"f96fd0f21485f37ca3d07db43fe2a103e8e7bc47e19557f164ede9494bc994bd"} Apr 16 15:02:42.172190 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:42.171849 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:02:42.195875 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:42.195832 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-kcdp4" podStartSLOduration=1.728070928 podStartE2EDuration="4.195817161s" 
podCreationTimestamp="2026-04-16 15:02:38 +0000 UTC" firstStartedPulling="2026-04-16 15:02:39.372114181 +0000 UTC m=+631.753111774" lastFinishedPulling="2026-04-16 15:02:41.839860415 +0000 UTC m=+634.220858007" observedRunningTime="2026-04-16 15:02:42.194113401 +0000 UTC m=+634.575111026" watchObservedRunningTime="2026-04-16 15:02:42.195817161 +0000 UTC m=+634.576814774" Apr 16 15:02:53.177263 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:02:53.177235 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-kcdp4" Apr 16 15:07:08.138984 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:07:08.138897 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:07:08.140878 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:07:08.140850 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:12:08.160178 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:12:08.160145 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:12:08.162696 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:12:08.162669 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:17:08.189776 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:08.189748 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:17:08.194216 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:08.194191 2561 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:17:18.847275 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:18.847246 2561 ???:1] "http: TLS handshake error from 10.0.142.46:60410: EOF" Apr 16 15:17:18.853145 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:18.853120 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6fksx_2a785a0f-ff3d-4e46-9883-f604a6fec502/global-pull-secret-syncer/0.log" Apr 16 15:17:19.003762 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:19.003727 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nzfb6_c9a20462-c104-4c17-a123-4c9d7acb06df/konnectivity-agent/0.log" Apr 16 15:17:19.069788 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:19.069766 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-86.ec2.internal_2e98882ee5619cf5d9be6b369dc8f0f8/haproxy/0.log" Apr 16 15:17:22.884306 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:22.884235 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-jvldv_db20efb8-03e9-4adc-82bf-e69768c8c347/cluster-monitoring-operator/0.log" Apr 16 15:17:22.979120 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:22.979098 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-56f49c9dcd-c869v_3debccbd-ccd0-4888-be40-8734a1730d29/metrics-server/0.log" Apr 16 15:17:23.182308 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.182238 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vf9jk_cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46/node-exporter/0.log" Apr 16 15:17:23.203698 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.203676 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-vf9jk_cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46/kube-rbac-proxy/0.log" Apr 16 15:17:23.223117 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.223096 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vf9jk_cc35843d-9efc-4d7f-9cc6-a0e5d5a25c46/init-textfile/0.log" Apr 16 15:17:23.247357 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.247337 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-mfp4j_3367a751-ed62-4f70-b4a8-77d1c4071c5e/kube-rbac-proxy-main/0.log" Apr 16 15:17:23.270827 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.270808 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-mfp4j_3367a751-ed62-4f70-b4a8-77d1c4071c5e/kube-rbac-proxy-self/0.log" Apr 16 15:17:23.293786 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.293760 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-mfp4j_3367a751-ed62-4f70-b4a8-77d1c4071c5e/openshift-state-metrics/0.log" Apr 16 15:17:23.487383 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.487307 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-2jfnl_dd8d9682-b82b-4d5a-b119-96819a1ff714/prometheus-operator/0.log" Apr 16 15:17:23.509671 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.509648 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-2jfnl_dd8d9682-b82b-4d5a-b119-96819a1ff714/kube-rbac-proxy/0.log" Apr 16 15:17:23.534183 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:23.534161 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-bpszm_54b1b226-0457-4415-85be-c66a6e47c41b/prometheus-operator-admission-webhook/0.log" Apr 16 15:17:24.847132 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:24.847107 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-7htx8_dc9b1052-22c3-43fc-82e3-c5e203f94377/networking-console-plugin/0.log" Apr 16 15:17:25.261839 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:25.261763 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/2.log" Apr 16 15:17:25.266088 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:25.266067 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-sx65q_e23b4d8a-18ac-44f6-bf57-10e1ac5b3247/console-operator/3.log" Apr 16 15:17:25.612506 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:25.612485 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-598cbdf847-tvt9d_7f059092-ee05-42ba-90c4-6943c75e7795/console/0.log" Apr 16 15:17:25.637948 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:25.637928 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-bmqpp_67dfdfc1-e78f-4a26-ac78-22a18df6d6a7/download-server/0.log" Apr 16 15:17:25.989546 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:25.989482 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-xlt7h_4b6350c7-96e4-468b-b761-620bbf50fa63/volume-data-source-validator/0.log" Apr 16 15:17:26.249044 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.248975 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"] Apr 16 
15:17:26.252303 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.252282 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf" Apr 16 15:17:26.254926 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.254906 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"kube-root-ca.crt\"" Apr 16 15:17:26.255968 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.255949 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtlq6\"/\"default-dockercfg-sktqr\"" Apr 16 15:17:26.256023 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.255971 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtlq6\"/\"openshift-service-ca.crt\"" Apr 16 15:17:26.260037 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.260016 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"] Apr 16 15:17:26.395383 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.395353 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-sys\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf" Apr 16 15:17:26.395531 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.395506 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf6hd\" (UniqueName: \"kubernetes.io/projected/d8072c12-a639-4bbc-84ca-6fef12a352e1-kube-api-access-gf6hd\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf" Apr 16 15:17:26.395603 
ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.395541 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-proc\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.395689 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.395636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-podres\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.395758 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.395739 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-lib-modules\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.496933 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.496907 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-sys\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497034 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.496949 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf6hd\" (UniqueName: \"kubernetes.io/projected/d8072c12-a639-4bbc-84ca-6fef12a352e1-kube-api-access-gf6hd\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497034 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.496974 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-proc\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497034 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.497019 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-sys\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497034 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.497011 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-podres\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497197 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.497082 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-lib-modules\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497197 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.497100 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-proc\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497197 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.497117 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-podres\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.497320 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.497229 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d8072c12-a639-4bbc-84ca-6fef12a352e1-lib-modules\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.505070 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.505019 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf6hd\" (UniqueName: \"kubernetes.io/projected/d8072c12-a639-4bbc-84ca-6fef12a352e1-kube-api-access-gf6hd\") pod \"perf-node-gather-daemonset-fz5zf\" (UID: \"d8072c12-a639-4bbc-84ca-6fef12a352e1\") " pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.563002 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.562979 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:26.623753 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.623724 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5lzf7_18926de0-0561-424c-845b-6ea1059c821a/dns/0.log"
Apr 16 15:17:26.648036 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.648012 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5lzf7_18926de0-0561-424c-845b-6ea1059c821a/kube-rbac-proxy/0.log"
Apr 16 15:17:26.681598 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.681574 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"]
Apr 16 15:17:26.684492 ip-10-0-142-86 kubenswrapper[2561]: W0416 15:17:26.684465 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd8072c12_a639_4bbc_84ca_6fef12a352e1.slice/crio-93287cad2bccca2a8f522d1633babc0d49ac65258dc2160f38676fa2c7063cfe WatchSource:0}: Error finding container 93287cad2bccca2a8f522d1633babc0d49ac65258dc2160f38676fa2c7063cfe: Status 404 returned error can't find the container with id 93287cad2bccca2a8f522d1633babc0d49ac65258dc2160f38676fa2c7063cfe
Apr 16 15:17:26.686369 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.686355 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:17:26.814270 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:26.814244 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h54t4_72695b7c-a9cd-4e46-80e9-10740ab7de94/dns-node-resolver/0.log"
Apr 16 15:17:27.020766 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:27.020731 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf" event={"ID":"d8072c12-a639-4bbc-84ca-6fef12a352e1","Type":"ContainerStarted","Data":"d0e05f4b9ee6579efb869906091e2a4a4084cd5e57b55f90ef7236facf41e7e7"}
Apr 16 15:17:27.020766 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:27.020768 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf" event={"ID":"d8072c12-a639-4bbc-84ca-6fef12a352e1","Type":"ContainerStarted","Data":"93287cad2bccca2a8f522d1633babc0d49ac65258dc2160f38676fa2c7063cfe"}
Apr 16 15:17:27.021264 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:27.020862 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:27.037040 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:27.036998 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf" podStartSLOduration=1.036984793 podStartE2EDuration="1.036984793s" podCreationTimestamp="2026-04-16 15:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:27.035184097 +0000 UTC m=+1519.416181712" watchObservedRunningTime="2026-04-16 15:17:27.036984793 +0000 UTC m=+1519.417982406"
Apr 16 15:17:27.274786 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:27.274762 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tkx2t_f593ceff-50dd-4533-bbeb-0dcc375c12b9/node-ca/0.log"
Apr 16 15:17:27.966671 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:27.966630 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-55d8df8b-t92xc_856eaa92-f51a-4a81-8e75-e2010da158d8/router/0.log"
Apr 16 15:17:28.307852 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:28.307807 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vlvr9_d01b24a9-f9f3-4d8c-830c-38ff2cc50292/serve-healthcheck-canary/0.log"
Apr 16 15:17:28.665109 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:28.665026 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4c465_88cecfa9-1dbd-4fa5-a33b-463543fb9b31/kube-rbac-proxy/0.log"
Apr 16 15:17:28.684323 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:28.684300 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4c465_88cecfa9-1dbd-4fa5-a33b-463543fb9b31/exporter/0.log"
Apr 16 15:17:28.703295 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:28.703276 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4c465_88cecfa9-1dbd-4fa5-a33b-463543fb9b31/extractor/0.log"
Apr 16 15:17:30.663402 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:30.663366 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7669bdc57-9nc5f_9b557b33-e79d-4ed8-8dcb-3a2cfaa263b9/manager/0.log"
Apr 16 15:17:30.682947 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:30.682922 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-nlz29_7a3cbf79-de66-4c90-8568-5d8392101c7e/manager/0.log"
Apr 16 15:17:30.786531 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:30.786509 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-kcdp4_05ed9814-601f-4e12-b92a-5bfa7caf44d1/manager/0.log"
Apr 16 15:17:33.033523 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:33.033493 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xtlq6/perf-node-gather-daemonset-fz5zf"
Apr 16 15:17:34.806942 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:34.806911 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-nth4n_46f7af62-73ce-4f89-a210-d2280368ebfc/kube-storage-version-migrator-operator/1.log"
Apr 16 15:17:34.807772 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:34.807754 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-nth4n_46f7af62-73ce-4f89-a210-d2280368ebfc/kube-storage-version-migrator-operator/0.log"
Apr 16 15:17:35.801297 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:35.801273 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4h6ft_c6c72ed9-22d4-4b46-99b0-c1f258e78270/kube-multus/0.log"
Apr 16 15:17:36.035651 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.035625 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/kube-multus-additional-cni-plugins/0.log"
Apr 16 15:17:36.088419 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.088347 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/egress-router-binary-copy/0.log"
Apr 16 15:17:36.116366 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.116345 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/cni-plugins/0.log"
Apr 16 15:17:36.136977 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.136959 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/bond-cni-plugin/0.log"
Apr 16 15:17:36.159313 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.159294 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/routeoverride-cni/0.log"
Apr 16 15:17:36.178638 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.178601 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/whereabouts-cni-bincopy/0.log"
Apr 16 15:17:36.197910 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.197883 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-575fz_c1f3fc48-12ff-4b51-a439-082e842f2b08/whereabouts-cni/0.log"
Apr 16 15:17:36.432843 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.432743 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bppkn_ed427102-c549-468d-8146-32fba6da0a45/network-metrics-daemon/0.log"
Apr 16 15:17:36.450019 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:36.449997 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bppkn_ed427102-c549-468d-8146-32fba6da0a45/kube-rbac-proxy/0.log"
Apr 16 15:17:37.808001 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.807970 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/ovn-controller/0.log"
Apr 16 15:17:37.834396 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.834367 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/ovn-acl-logging/0.log"
Apr 16 15:17:37.852726 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.852701 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/kube-rbac-proxy-node/0.log"
Apr 16 15:17:37.871127 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.871108 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 15:17:37.888737 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.888715 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/northd/0.log"
Apr 16 15:17:37.908688 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.908668 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/nbdb/0.log"
Apr 16 15:17:37.927581 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:37.927552 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/sbdb/0.log"
Apr 16 15:17:38.013525 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:38.013502 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtjk7_cf2d6237-3c32-44f2-bf46-6f36e887e3c2/ovnkube-controller/0.log"
Apr 16 15:17:38.926168 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:38.926120 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cp47z_d37fc185-ce8f-4c06-ace2-ca0a852977db/network-check-target-container/0.log"
Apr 16 15:17:39.850103 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:39.850079 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-577nk_227b75fb-5d17-43d6-a870-e0959b3989c4/iptables-alerter/0.log"
Apr 16 15:17:40.480764 ip-10-0-142-86 kubenswrapper[2561]: I0416 15:17:40.480734 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bg8fh_59571617-fe46-4cd8-8766-d1b4dbb300e8/tuned/0.log"