Apr 16 13:56:44.755429 ip-10-0-142-16 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:44.755456 ip-10-0-142-16 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:44.755465 ip-10-0-142-16 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:44.755766 ip-10-0-142-16 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:56.413314 ip-10-0-142-16 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:56.413331 ip-10-0-142-16 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 42218c4ef05e4030a14302d68e3be4ea --
Apr 16 13:59:22.519685 ip-10-0-142-16 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:23.051009 ip-10-0-142-16 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:23.051009 ip-10-0-142-16 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:23.051009 ip-10-0-142-16 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:23.051009 ip-10-0-142-16 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:23.051009 ip-10-0-142-16 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:23.052885 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.052795 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:23.056034 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056019 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.056034 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056034 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056038 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056041 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056044 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056046 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056051 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056054 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056057 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056060 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056062 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056065 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056068 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056071 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056073 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056076 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056079 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056081 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056084 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056086 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.056092 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056089 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056092 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056095 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056098 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056100 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056103 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056106 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056109 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056111 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056114 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056116 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056119 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056121 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056124 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056126 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056129 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056131 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056134 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056136 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.056571 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056139 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056141 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056144 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056147 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056149 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056151 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056154 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056160 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056163 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056166 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056169 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056171 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056174 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056177 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056181 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056186 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056188 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056191 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056194 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056196 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.057039 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056199 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056201 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056204 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056206 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056209 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056211 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056214 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056216 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056218 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056221 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056225 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056228 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056231 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056233 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056236 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056238 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056241 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056243 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056246 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056248 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.057519 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056250 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.058013 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056253 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.058013 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056255 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.058013 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056257 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.058013 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056260 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.058013 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056262 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.058013 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.056265 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.059424 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059412 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.059424 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059423 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059427 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059430 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059433 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059436 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059439 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059441 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059444 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059447 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059449 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059452 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059455 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059458 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059460 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059463 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059466 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059468 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059471 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059473 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059476 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.059482 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059478 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059481 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059484 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059487 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059489 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059494 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059497 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059500 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059503 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059506 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059509 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059512 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059515 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059518 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059520 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059523 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059525 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059527 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059530 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059532 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.059975 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059534 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059537 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059539 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059542 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059544 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059548 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059551 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059554 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059558 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059561 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059563 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059566 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059569 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059571 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059574 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059577 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059579 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059582 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059585 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.060496 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059587 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059590 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059593 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059595 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059597 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059600 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059603 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059605 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059608 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059610 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059613 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059615 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059618 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059620 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059623 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059625 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059628 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059631 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059634 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059636 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.060947 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059639 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059641 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059644 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059648 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059651 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.059653 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059720 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059726 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059733 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059738 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059755 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059759 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059764 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059768 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059772 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059775 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059778 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059781 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059785 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059789 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059792 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059795 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059797 2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:23.061428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059800 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059803 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059807 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059810 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059813 2569 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059815 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059819 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059823 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059826 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059829 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059832 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059835 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059839 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059842 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059845 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059848 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059852 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059855 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059858 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059861 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059864 2569 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059867 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059871 2569 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059874 2569 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059878 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:23.061998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059881 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059885 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059889 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059893 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059897 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059900 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059903 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059906 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059909 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059911 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059915 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059918 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059921 2569 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059924 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059928 2569 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059931 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059934 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059937 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059941 2569 flags.go:64] FLAG: --help="false" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059944 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059947 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059950 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059953 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059956 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:23.062634 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059959 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059962 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059965 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059967 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:23.063230 ip-10-0-142-16 
kubenswrapper[2569]: I0416 13:59:23.059970 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059973 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059976 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059979 2569 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059983 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059986 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059989 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059992 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059995 2569 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.059998 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060001 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060004 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060009 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060012 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060014 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 
13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060017 2569 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060020 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060023 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060026 2569 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060029 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060033 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:23.063230 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060036 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060040 2569 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060043 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060046 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060049 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060052 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060055 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060058 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060061 2569 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060068 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060071 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060074 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060077 2569 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060080 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060086 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060089 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060092 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060095 2569 flags.go:64] FLAG: --port="10250" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060098 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060101 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e666857028eaf35c" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060105 2569 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060108 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 
13:59:23.060110 2569 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060113 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:23.063828 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060116 2569 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060119 2569 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060122 2569 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060125 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060127 2569 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060131 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060134 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060136 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060140 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060142 2569 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060145 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060149 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060151 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 
13:59:23.060154 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060157 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060160 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060163 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060166 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060169 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060172 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060175 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060177 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060181 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060184 2569 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060186 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:23.064421 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060191 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060194 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060197 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060201 2569 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060204 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060207 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060209 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060212 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060215 2569 flags.go:64] FLAG: --v="2" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060219 2569 flags.go:64] FLAG: --version="false" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060223 2569 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060227 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.060230 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060319 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060322 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060325 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060328 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:23.064998 
ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060331 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060334 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060336 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060339 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060342 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:23.064998 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060344 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060347 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060350 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060354 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060357 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060360 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060362 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060368 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060370 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060373 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060376 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060378 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060381 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060383 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060386 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060389 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060391 2569 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060410 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060413 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:23.065528 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060415 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060418 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060420 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060423 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060426 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060429 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060432 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060434 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060437 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060440 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: 
W0416 13:59:23.060442 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060445 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060447 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060450 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060453 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060455 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060458 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060462 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060465 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060468 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:23.066050 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060473 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060476 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060479 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060481 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060484 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060486 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060489 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060491 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060494 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060497 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060500 2569 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060502 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060505 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060508 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060510 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060513 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060516 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060518 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060521 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:23.066558 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060524 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060526 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060529 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060531 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:23.067127 ip-10-0-142-16 
kubenswrapper[2569]: W0416 13:59:23.060533 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060536 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060538 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060541 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060543 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060546 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060548 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060551 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060553 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060557 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060559 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060562 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060564 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 
13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060566 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:23.067127 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.060569 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:23.067789 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.061241 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:23.069253 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.069234 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:59:23.069253 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.069254 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069301 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069306 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069309 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069313 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069316 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages 
Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069318 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069321 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.069324 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069325 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069329 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069332 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069335 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069338 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069341 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069344 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069346 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069349 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069352 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069355 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069358 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069360 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069363 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069366 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069368 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069371 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069373 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069376 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069378 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.069621 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069381 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069384 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069386 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069389 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069392 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069408 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069411 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069414 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069416 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069419 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069422 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069424 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069427 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069429 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069433 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069437 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069439 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069443 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069446 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069448 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.070141 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069451 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069454 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069456 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069459 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069461 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069464 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069466 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069469 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069471 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069474 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069476 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069479 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069482 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069484 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069487 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069490 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069493 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069495 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069497 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069500 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.070657 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069502 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069505 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069507 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069510 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069512 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069515 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069518 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069521 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069524 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069527 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069530 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069532 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069535 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069537 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069540 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069542 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069545 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069547 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.071197 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069551 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.069557 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069672 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069677 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069680 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069683 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069686 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069689 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069692 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069694 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069697 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069699 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069702 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069704 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069707 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:23.071632 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069709 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069712 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069714 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069717 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069720 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069722 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069725 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069727 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069730 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069733 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069735 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069738 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069740 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069743 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069745 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069748 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069751 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069753 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069755 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069758 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:23.071991 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069760 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069763 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069765 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069767 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069770 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069773 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069775 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069778 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069781 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069789 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069792 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069795 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069798 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069800 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069803 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069805 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069808 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069811 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069813 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:23.072476 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069816 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069819 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069821 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069825 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069846 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069849 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069851 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069867 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069871 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069875 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069878 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069881 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069883 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069886 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069889 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069891 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069895 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069898 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069901 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:23.072924 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069904 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069906 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069909 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069911 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069913 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069916 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069919 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069921 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069924 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069927 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069930 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069933 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069935 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069938 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:23.069940 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.069945 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:23.073370 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.070587 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:23.075904 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.075890 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:23.077318 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.077307 2569 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:23.077430 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.077413 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:23.077472 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.077449 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:23.108044 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.108024 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:23.115214 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.115188 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:23.130103 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.130081 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:23.136666 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.136651 2569 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:23.138825 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.138804 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:23.142719 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.142700 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:23.144053 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.144034 2569 fs.go:135] Filesystem UUIDs: map[0777a231-474f-4a2d-8e3b-acda958911f9:/dev/nvme0n1p4 3d0eba7a-f58b-48a1-a98d-4e57bd26e8f5:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:23.144094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.144054 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:23.150003 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.149901 2569 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:23.147678936 +0000 UTC m=+0.488053780 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3161924 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec201260d1be0be39dc946baf589b397 SystemUUID:ec201260-d1be-0be3-9dc9-46baf589b397 BootID:42218c4e-f05e-4030-a143-02d68e3be4ea Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:df:0f:56:d7:5d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:df:0f:56:d7:5d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:81:11:af:8f:76 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:23.150003 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.149999 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:23.150123 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.150081 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:23.151881 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.151854 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:23.152018 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.151884 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:23.152060 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.152028 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:23.152060 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.152036 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:23.152060 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.152053 2569 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:23.153156 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.153145 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:23.153964 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.153954 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:23.154067 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.154058 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:23.156841 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.156831 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:23.156876 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.156845 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:59:23.156876 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.156856 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:23.156876 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.156865 2569 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:23.156965 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.156880 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:23.158220 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.158204 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:23.158268 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.158228 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:23.162101 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.162087 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:23.163870 ip-10-0-142-16 
kubenswrapper[2569]: I0416 13:59:23.163851 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:23.165227 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165214 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165234 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165244 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165252 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165260 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165270 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165279 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165287 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165297 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:23.165302 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165305 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:23.165588 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165325 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:23.165588 
ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.165339 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:23.166357 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.166345 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:23.166420 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.166360 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:23.166475 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.166413 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5f5p6" Apr 16 13:59:23.169303 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.169277 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:23.169303 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.169279 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:23.169726 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.169713 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:23.169800 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.169755 2569 server.go:1295] "Started kubelet" Apr 16 13:59:23.169877 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.169853 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:23.170547 ip-10-0-142-16 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 13:59:23.170975 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.170554 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:23.170975 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.170671 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:23.171656 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.171644 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:23.173004 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.172988 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:23.177630 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.177581 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5f5p6" Apr 16 13:59:23.177984 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.177901 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:23.178094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.178047 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.179592 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.179610 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.179782 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.179898 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.179913 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:23.181650 ip-10-0-142-16 
kubenswrapper[2569]: I0416 13:59:23.180644 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-16.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181244 2569 factory.go:55] Registering systemd factory Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181259 2569 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181585 2569 factory.go:153] Registering CRI-O factory Apr 16 13:59:23.181650 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181599 2569 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:23.182144 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181661 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:23.182144 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181696 2569 factory.go:103] Registering Raw factory Apr 16 13:59:23.182144 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.181712 2569 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:23.182144 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.181869 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found" Apr 16 13:59:23.182792 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.182555 2569 manager.go:319] Starting recovery of all containers Apr 16 13:59:23.185353 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.180669 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-16.ec2.internal.18a6db0f9e385ecd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-16.ec2.internal,UID:ip-10-0-142-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-16.ec2.internal,},FirstTimestamp:2026-04-16 13:59:23.169726157 +0000 UTC m=+0.510101001,LastTimestamp:2026-04-16 13:59:23.169726157 +0000 UTC m=+0.510101001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-16.ec2.internal,}" Apr 16 13:59:23.185700 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.185675 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:59:23.187550 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.187525 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:23.190803 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.190778 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-16.ec2.internal\" not found" node="ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.194988 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.194971 2569 manager.go:324] Recovery completed Apr 16 13:59:23.198672 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.198658 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.201050 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.201037 2569 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.201109 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.201064 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.201109 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.201076 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.201517 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.201504 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:23.201561 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.201517 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:23.201561 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.201534 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:23.204855 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.204739 2569 policy_none.go:49] "None policy: Start" Apr 16 13:59:23.204855 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.204855 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:23.204939 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.204865 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:23.244958 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.244943 2569 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.244979 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.244992 2569 server.go:85] "Starting device plugin registration server" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.245233 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 
13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.245246 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.245341 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.245452 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.245460 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.246110 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:59:23.265940 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.246173 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-16.ec2.internal\" not found" Apr 16 13:59:23.306968 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.306893 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:59:23.308028 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.308013 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:59:23.308105 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.308042 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:23.308105 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.308063 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 13:59:23.308105 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.308072 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:23.308232 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.308111 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:23.312374 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.312358 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:23.346071 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.346058 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.347648 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.347633 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.347720 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.347662 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.347720 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.347675 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.347720 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.347696 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.355570 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.355554 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.355623 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.355578 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-16.ec2.internal\": node \"ip-10-0-142-16.ec2.internal\" not found" Apr 16 13:59:23.374634 
ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.374610 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found" Apr 16 13:59:23.408602 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.408562 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"] Apr 16 13:59:23.408684 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.408642 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.410213 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.410198 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.410290 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.410225 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.410290 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.410235 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.412509 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.412497 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.412652 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.412639 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.412688 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.412665 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.413296 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.413281 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.413355 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.413308 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.413355 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.413325 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.413450 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.413286 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.413450 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.413383 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.413450 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.413412 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.416689 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.416554 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.416689 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.416635 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:23.417805 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.417464 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:23.417805 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.417490 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:23.417805 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.417506 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:23.450049 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.450024 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-16.ec2.internal\" not found" node="ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.454314 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.454300 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-16.ec2.internal\" not found" node="ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.474830 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.474814 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found" Apr 16 13:59:23.482570 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.482551 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/104ec126e58c79948296ecdd10d4aa5b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-142-16.ec2.internal\" (UID: \"104ec126e58c79948296ecdd10d4aa5b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.482655 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.482575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/173ef8d8e696e627a768316289085c1e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"173ef8d8e696e627a768316289085c1e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.482655 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.482594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173ef8d8e696e627a768316289085c1e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"173ef8d8e696e627a768316289085c1e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.575306 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.575224 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found" Apr 16 13:59:23.583627 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.583608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/104ec126e58c79948296ecdd10d4aa5b-config\") pod \"kube-apiserver-proxy-ip-10-0-142-16.ec2.internal\" (UID: \"104ec126e58c79948296ecdd10d4aa5b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" Apr 16 13:59:23.583695 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.583633 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/173ef8d8e696e627a768316289085c1e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"173ef8d8e696e627a768316289085c1e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.583695 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.583651 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173ef8d8e696e627a768316289085c1e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"173ef8d8e696e627a768316289085c1e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.583763 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.583698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/173ef8d8e696e627a768316289085c1e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"173ef8d8e696e627a768316289085c1e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.583763 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.583710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/104ec126e58c79948296ecdd10d4aa5b-config\") pod \"kube-apiserver-proxy-ip-10-0-142-16.ec2.internal\" (UID: \"104ec126e58c79948296ecdd10d4aa5b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.583763 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.583721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/173ef8d8e696e627a768316289085c1e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"173ef8d8e696e627a768316289085c1e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.676057 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.676006 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:23.753608 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.753582 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.757162 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:23.757138 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:23.777008 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.776984 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:23.878073 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.877973 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:23.978533 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:23.978498 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.018069 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.018051 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:24.076995 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.076968 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:24.077513 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.077130 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:24.077513 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.077143 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:24.077513 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.077143 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:24.079114 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.079095 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.179068 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.178999 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:24.179645 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.179627 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.182729 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.182686 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:23 +0000 UTC" deadline="2028-01-17 09:28:05.658917228 +0000 UTC"
Apr 16 13:59:24.182729 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.182717 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15379h28m41.476202956s"
Apr 16 13:59:24.208021 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.207999 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:24.235295 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.235272 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tbdm6"
Apr 16 13:59:24.241660 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.241638 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tbdm6"
Apr 16 13:59:24.280631 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.280598 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.310552 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:24.310515 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod173ef8d8e696e627a768316289085c1e.slice/crio-ef06888b1296ef9b87aac9553da7886c5d104c43384894a0e783724ecbbddaf8 WatchSource:0}: Error finding container ef06888b1296ef9b87aac9553da7886c5d104c43384894a0e783724ecbbddaf8: Status 404 returned error can't find the container with id ef06888b1296ef9b87aac9553da7886c5d104c43384894a0e783724ecbbddaf8
Apr 16 13:59:24.311091 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:24.311072 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod104ec126e58c79948296ecdd10d4aa5b.slice/crio-4af019f1da59537c9ce890232d7ff080145fb0411c935a96da131fea7b45f682 WatchSource:0}: Error finding container 4af019f1da59537c9ce890232d7ff080145fb0411c935a96da131fea7b45f682: Status 404 returned error can't find the container with id 4af019f1da59537c9ce890232d7ff080145fb0411c935a96da131fea7b45f682
Apr 16 13:59:24.315115 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.315102 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:24.381306 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.381274 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.481772 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.481697 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.582225 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.582191 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.683004 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:24.682966 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Apr 16 13:59:24.718813 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.718781 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:24.780696 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.780617 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:24.791612 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.791587 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:24.793480 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.793244 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Apr 16 13:59:24.802062 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.801978 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:24.891693 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.891670 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:24.954391 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:24.954361 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:25.158472 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.158382 2569 apiserver.go:52] "Watching apiserver"
Apr 16 13:59:25.164700 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.164678 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 13:59:25.165172 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.165143 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-lpgrv","kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp","openshift-dns/node-resolver-p4h9n","openshift-multus/multus-additional-cni-plugins-wlchs","openshift-ovn-kubernetes/ovnkube-node-prt2z","kube-system/konnectivity-agent-f5zrv","openshift-cluster-node-tuning-operator/tuned-ql6p9","openshift-image-registry/node-ca-xv4n6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal","openshift-multus/multus-t2gj8","openshift-multus/network-metrics-daemon-lfj5m","openshift-network-diagnostics/network-check-target-f6dlw"]
Apr 16 13:59:25.168554 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.168534 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f5zrv"
Apr 16 13:59:25.171143 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.171089 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 13:59:25.171788 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.171770 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.172439 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.172369 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lmmnh\""
Apr 16 13:59:25.172439 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.172387 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 13:59:25.173675 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.173657 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.173967 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.173948 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.177094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.174716 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.177094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.174821 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 13:59:25.177094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.174839 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5rgm5\""
Apr 16 13:59:25.177094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.175632 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.177094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.175672 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cqpdx\""
Apr 16 13:59:25.177094 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.175879 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.177575 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.177553 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.179577 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.179559 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 13:59:25.179690 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.179646 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 13:59:25.180194 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.179866 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.180194 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.179875 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.180194 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.180127 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m2ddc\""
Apr 16 13:59:25.180374 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.180131 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.180374 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.180205 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 13:59:25.182078 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182058 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 13:59:25.182078 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182068 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 13:59:25.182226 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182097 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cg5wp\""
Apr 16 13:59:25.182226 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182111 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 13:59:25.182226 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182124 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.182226 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182145 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 13:59:25.182538 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.182519 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.185134 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.184901 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lpgrv"
Apr 16 13:59:25.187093 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.186886 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.187093 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.186904 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.187093 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.186942 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 13:59:25.187093 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.186991 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-htdc2\""
Apr 16 13:59:25.187378 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.187205 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.187378 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.187300 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xv4n6"
Apr 16 13:59:25.189110 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.189093 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.189307 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.189223 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mph7k\""
Apr 16 13:59:25.189392 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.189332 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:59:25.189392 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.189345 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lrkm6\""
Apr 16 13:59:25.189736 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.189720 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.189819 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.189776 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.190045 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.190025 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 13:59:25.190283 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.190180 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 13:59:25.191615 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.191599 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 13:59:25.191883 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.191870 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5kxlr\""
Apr 16 13:59:25.192180 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-device-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.192272 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192194 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wg7b\" (UniqueName: \"kubernetes.io/projected/ca33a748-eaae-40ea-9131-81e3f97ea69d-kube-api-access-9wg7b\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.192272 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-os-release\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192272 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192272 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf224\" (UniqueName: \"kubernetes.io/projected/44914f75-f504-40b2-932d-a36d8319394c-kube-api-access-mf224\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192495 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-ovnkube-script-lib\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192495 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192371 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b74162e-fbf4-4ede-b8ad-5623c1094615-agent-certs\") pod \"konnectivity-agent-f5zrv\" (UID: \"2b74162e-fbf4-4ede-b8ad-5623c1094615\") " pod="kube-system/konnectivity-agent-f5zrv"
Apr 16 13:59:25.192495 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192440 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca33a748-eaae-40ea-9131-81e3f97ea69d-hosts-file\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.192495 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192467 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-cnibin\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192495 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-var-lib-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192715 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192513 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-ovn\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192715 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192537 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-log-socket\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192715 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192559 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-run-ovn-kubernetes\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192715 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca33a748-eaae-40ea-9131-81e3f97ea69d-tmp-dir\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.192715 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192715 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-cni-bin\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192933 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.192933 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192751 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-socket-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.192933 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-etc-selinux\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.192933 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-system-cni-dir\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192933 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192842 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.192933 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-etc-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.192952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-node-log\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193019 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6def905-3f86-432f-b6ba-a5f4649cc324-ovn-node-metrics-cert\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193043 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-systemd\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193102 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-ovnkube-config\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193153 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjh2\" (UniqueName: \"kubernetes.io/projected/e6def905-3f86-432f-b6ba-a5f4649cc324-kube-api-access-4hjh2\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193192 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193188 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b74162e-fbf4-4ede-b8ad-5623c1094615-konnectivity-ca\") pod \"konnectivity-agent-f5zrv\" (UID: \"2b74162e-fbf4-4ede-b8ad-5623c1094615\") " pod="kube-system/konnectivity-agent-f5zrv"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-sys-fs\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-kubelet\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-run-netns\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-cni-netd\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-env-overrides\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.193370 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-registration-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.193461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193447 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5bg\" (UniqueName: \"kubernetes.io/projected/169891fa-6d6f-48ee-a833-f55805467ffd-kube-api-access-4q5bg\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.193806 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-systemd-units\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.193806 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.193518 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-slash\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.195495 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.195477 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:25.195594 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.195540 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:25.242490 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.242461 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:24 +0000 UTC" deadline="2027-12-15 22:23:11.871438776 +0000 UTC" Apr 16 13:59:25.242490 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.242488 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14600h23m46.628954025s" Apr 16 13:59:25.280863 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.280833 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:25.294140 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.294279 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294152 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.294279 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-ovnkube-config\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.294279 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294262 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.294428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294410 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:25.294484 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-run\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.294541 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzkz\" 
(UniqueName: \"kubernetes.io/projected/d6588176-b995-4b14-80e6-c2ba40893912-kube-api-access-wlzkz\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.294580 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294556 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-cnibin\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.294627 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f104cac-2458-4f3f-b7d2-b71aef2dff52-cni-binary-copy\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.294672 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294643 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-hostroot\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.294712 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b74162e-fbf4-4ede-b8ad-5623c1094615-konnectivity-ca\") pod \"konnectivity-agent-f5zrv\" (UID: \"2b74162e-fbf4-4ede-b8ad-5623c1094615\") " pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:25.294712 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-sys-fs\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.294804 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-kubelet\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.294804 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294791 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-kubelet\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.294888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-sys-fs\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.294888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-env-overrides\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.294888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.294888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294878 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-cni-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294916 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-system-cni-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-ovnkube-config\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294944 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.294993 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-modprobe-d\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysconfig\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295050 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-kubernetes\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.295085 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75pjt\" (UniqueName: \"kubernetes.io/projected/bdcb6f5a-276e-476f-ace0-4bc3f243da52-kube-api-access-75pjt\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5bg\" (UniqueName: \"kubernetes.io/projected/169891fa-6d6f-48ee-a833-f55805467ffd-kube-api-access-4q5bg\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-systemd-units\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-device-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-systemd-units\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295234 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b74162e-fbf4-4ede-b8ad-5623c1094615-konnectivity-ca\") pod \"konnectivity-agent-f5zrv\" (UID: \"2b74162e-fbf4-4ede-b8ad-5623c1094615\") " pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295224 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-os-release\") pod \"multus-additional-cni-plugins-wlchs\" (UID: 
\"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-env-overrides\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-os-release\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mf224\" (UniqueName: \"kubernetes.io/projected/44914f75-f504-40b2-932d-a36d8319394c-kube-api-access-mf224\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-ovnkube-script-lib\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-device-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.295432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295424 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b74162e-fbf4-4ede-b8ad-5623c1094615-agent-certs\") pod \"konnectivity-agent-f5zrv\" (UID: \"2b74162e-fbf4-4ede-b8ad-5623c1094615\") " pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-cnibin\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-var-lib-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-ovn\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295524 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-var-lib-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295529 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-ovn\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-run-ovn-kubernetes\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-os-release\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-run-ovn-kubernetes\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295610 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-cni-bin\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-var-lib-kubelet\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca33a748-eaae-40ea-9131-81e3f97ea69d-tmp-dir\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295663 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-cnibin\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295688 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-cni-binary-copy\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-cni-bin\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295753 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-iptables-alerter-script\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv" Apr 16 13:59:25.296014 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295744 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdcb6f5a-276e-476f-ace0-4bc3f243da52-tmp\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-cni-bin\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6588176-b995-4b14-80e6-c2ba40893912-serviceca\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295794 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-socket-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-system-cni-dir\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-node-log\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-system-cni-dir\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295939 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-socket-dir-parent\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e6def905-3f86-432f-b6ba-a5f4649cc324-ovnkube-script-lib\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295957 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-socket-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ca33a748-eaae-40ea-9131-81e3f97ea69d-tmp-dir\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.295998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-conf-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-node-log\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-etc-kubernetes\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.296759 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6588176-b995-4b14-80e6-c2ba40893912-host\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296076 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-systemd\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjh2\" (UniqueName: \"kubernetes.io/projected/e6def905-3f86-432f-b6ba-a5f4649cc324-kube-api-access-4hjh2\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-systemd\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296125 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-k8s-cni-cncf-io\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-systemd\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-kubelet\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-run-netns\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296269 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296300 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-cni-netd\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-run-netns\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296321 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv5vn\" (UniqueName: \"kubernetes.io/projected/44d7f301-04c1-422a-a689-9d0e4f02952c-kube-api-access-pv5vn\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296354 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-cni-netd\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-registration-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-run-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296445 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-registration-dir\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296444 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-cni-multus\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.297444 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysctl-conf\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-host\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-slash\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-multus-certs\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-host-slash\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-host-slash\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-lib-modules\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296754 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44914f75-f504-40b2-932d-a36d8319394c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296772 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wg7b\" (UniqueName: \"kubernetes.io/projected/ca33a748-eaae-40ea-9131-81e3f97ea69d-kube-api-access-9wg7b\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296799 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-netns\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysctl-d\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca33a748-eaae-40ea-9131-81e3f97ea69d-hosts-file\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-log-socket\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-log-socket\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-daemon-config\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ca33a748-eaae-40ea-9131-81e3f97ea69d-hosts-file\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.296982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwwf\" (UniqueName: \"kubernetes.io/projected/5f104cac-2458-4f3f-b7d2-b71aef2dff52-kube-api-access-7rwwf\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.298142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-sys\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-tuned\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-etc-selinux\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-etc-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6def905-3f86-432f-b6ba-a5f4649cc324-ovn-node-metrics-cert\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/169891fa-6d6f-48ee-a833-f55805467ffd-etc-selinux\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297304 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2mpd\" (UniqueName: \"kubernetes.io/projected/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-kube-api-access-n2mpd\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6def905-3f86-432f-b6ba-a5f4649cc324-etc-openvswitch\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.298938 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.297388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/44914f75-f504-40b2-932d-a36d8319394c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.299578 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.299560 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b74162e-fbf4-4ede-b8ad-5623c1094615-agent-certs\") pod \"konnectivity-agent-f5zrv\" (UID: \"2b74162e-fbf4-4ede-b8ad-5623c1094615\") " pod="kube-system/konnectivity-agent-f5zrv"
Apr 16 13:59:25.299733 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.299719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6def905-3f86-432f-b6ba-a5f4649cc324-ovn-node-metrics-cert\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.306006 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.305982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjh2\" (UniqueName: \"kubernetes.io/projected/e6def905-3f86-432f-b6ba-a5f4649cc324-kube-api-access-4hjh2\") pod \"ovnkube-node-prt2z\" (UID: \"e6def905-3f86-432f-b6ba-a5f4649cc324\") " pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 13:59:25.306350 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.306329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wg7b\" (UniqueName: \"kubernetes.io/projected/ca33a748-eaae-40ea-9131-81e3f97ea69d-kube-api-access-9wg7b\") pod \"node-resolver-p4h9n\" (UID: \"ca33a748-eaae-40ea-9131-81e3f97ea69d\") " pod="openshift-dns/node-resolver-p4h9n"
Apr 16 13:59:25.306467 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.306451 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf224\" (UniqueName: \"kubernetes.io/projected/44914f75-f504-40b2-932d-a36d8319394c-kube-api-access-mf224\") pod \"multus-additional-cni-plugins-wlchs\" (UID: \"44914f75-f504-40b2-932d-a36d8319394c\") " pod="openshift-multus/multus-additional-cni-plugins-wlchs"
Apr 16 13:59:25.306717 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.306696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5bg\" (UniqueName: \"kubernetes.io/projected/169891fa-6d6f-48ee-a833-f55805467ffd-kube-api-access-4q5bg\") pod \"aws-ebs-csi-driver-node-xrmkp\" (UID: \"169891fa-6d6f-48ee-a833-f55805467ffd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp"
Apr 16 13:59:25.311607 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.311555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" event={"ID":"173ef8d8e696e627a768316289085c1e","Type":"ContainerStarted","Data":"ef06888b1296ef9b87aac9553da7886c5d104c43384894a0e783724ecbbddaf8"}
Apr 16 13:59:25.312544 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.312526 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" event={"ID":"104ec126e58c79948296ecdd10d4aa5b","Type":"ContainerStarted","Data":"4af019f1da59537c9ce890232d7ff080145fb0411c935a96da131fea7b45f682"}
Apr 16 13:59:25.398385 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2mpd\" (UniqueName: \"kubernetes.io/projected/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-kube-api-access-n2mpd\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv"
Apr 16 13:59:25.398549 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 13:59:25.398549 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-run\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.398549 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398468 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzkz\" (UniqueName: \"kubernetes.io/projected/d6588176-b995-4b14-80e6-c2ba40893912-kube-api-access-wlzkz\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6"
Apr 16 13:59:25.398549 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398489 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-cnibin\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f104cac-2458-4f3f-b7d2-b71aef2dff52-cni-binary-copy\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.398579 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-run\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-hostroot\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-cni-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-system-cni-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.398713 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.898670019 +0000 UTC m=+3.239044870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398718 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-hostroot\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-cnibin\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 13:59:25.398754 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398760 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-system-cni-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398774 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-modprobe-d\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysconfig\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-cni-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-kubernetes\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75pjt\" (UniqueName: \"kubernetes.io/projected/bdcb6f5a-276e-476f-ace0-4bc3f243da52-kube-api-access-75pjt\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398899 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-modprobe-d\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398936 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysconfig\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-os-release\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-kubernetes\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.398987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-cni-bin\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399012 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-var-lib-kubelet\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399022 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-os-release\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399052 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-iptables-alerter-script\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-cni-bin\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdcb6f5a-276e-476f-ace0-4bc3f243da52-tmp\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399092 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-var-lib-kubelet\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399100 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6588176-b995-4b14-80e6-c2ba40893912-serviceca\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6"
Apr 16 13:59:25.399256 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-socket-dir-parent\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-conf-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-socket-dir-parent\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399193 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-etc-kubernetes\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399215 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-conf-dir\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-etc-kubernetes\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8"
Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399193 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f104cac-2458-4f3f-b7d2-b71aef2dff52-cni-binary-copy\") pod 
\"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6588176-b995-4b14-80e6-c2ba40893912-host\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6588176-b995-4b14-80e6-c2ba40893912-host\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-k8s-cni-cncf-io\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-k8s-cni-cncf-io\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-systemd\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-kubelet\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399360 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-systemd\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv5vn\" (UniqueName: \"kubernetes.io/projected/44d7f301-04c1-422a-a689-9d0e4f02952c-kube-api-access-pv5vn\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399429 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-kubelet\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399436 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-cni-multus\") pod \"multus-t2gj8\" (UID: 
\"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysctl-conf\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.399880 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399491 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-host\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-multus-certs\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6588176-b995-4b14-80e6-c2ba40893912-serviceca\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-host-slash\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " 
pod="openshift-network-operator/iptables-alerter-lpgrv" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399573 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-iptables-alerter-script\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-lib-modules\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399584 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-multus-certs\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-host-slash\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-var-lib-cni-multus\") pod \"multus-t2gj8\" (UID: 
\"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399604 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-netns\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399635 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f104cac-2458-4f3f-b7d2-b71aef2dff52-host-run-netns\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysctl-d\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-host\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-daemon-config\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " 
pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysctl-conf\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwwf\" (UniqueName: \"kubernetes.io/projected/5f104cac-2458-4f3f-b7d2-b71aef2dff52-kube-api-access-7rwwf\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399747 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-sys\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-lib-modules\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.400442 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-sysctl-d\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 
13:59:25.401074 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399800 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-tuned\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.401074 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.399812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdcb6f5a-276e-476f-ace0-4bc3f243da52-sys\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.401074 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.400118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f104cac-2458-4f3f-b7d2-b71aef2dff52-multus-daemon-config\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.401441 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.401420 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdcb6f5a-276e-476f-ace0-4bc3f243da52-tmp\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.401906 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.401888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bdcb6f5a-276e-476f-ace0-4bc3f243da52-etc-tuned\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.406670 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.406652 2569 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:25.406670 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.406672 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:25.406829 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.406684 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:25.406829 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.406765 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.906742149 +0000 UTC m=+3.247117001 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:25.409804 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.408833 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzkz\" (UniqueName: \"kubernetes.io/projected/d6588176-b995-4b14-80e6-c2ba40893912-kube-api-access-wlzkz\") pod \"node-ca-xv4n6\" (UID: \"d6588176-b995-4b14-80e6-c2ba40893912\") " pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.411426 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.410814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75pjt\" (UniqueName: \"kubernetes.io/projected/bdcb6f5a-276e-476f-ace0-4bc3f243da52-kube-api-access-75pjt\") pod \"tuned-ql6p9\" (UID: \"bdcb6f5a-276e-476f-ace0-4bc3f243da52\") " pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.411426 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.411414 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv5vn\" (UniqueName: \"kubernetes.io/projected/44d7f301-04c1-422a-a689-9d0e4f02952c-kube-api-access-pv5vn\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:25.412087 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.412072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwwf\" (UniqueName: \"kubernetes.io/projected/5f104cac-2458-4f3f-b7d2-b71aef2dff52-kube-api-access-7rwwf\") pod \"multus-t2gj8\" (UID: \"5f104cac-2458-4f3f-b7d2-b71aef2dff52\") " 
pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.412131 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.412107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2mpd\" (UniqueName: \"kubernetes.io/projected/ba1a0175-ff05-496f-a2ab-9b87059cf3c3-kube-api-access-n2mpd\") pod \"iptables-alerter-lpgrv\" (UID: \"ba1a0175-ff05-496f-a2ab-9b87059cf3c3\") " pod="openshift-network-operator/iptables-alerter-lpgrv" Apr 16 13:59:25.481160 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.481129 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:25.488654 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.488629 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" Apr 16 13:59:25.496632 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.496611 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p4h9n" Apr 16 13:59:25.502212 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.502195 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wlchs" Apr 16 13:59:25.507763 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.507748 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:25.514326 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.514299 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lpgrv" Apr 16 13:59:25.520874 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.520856 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" Apr 16 13:59:25.529383 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.529360 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xv4n6" Apr 16 13:59:25.533947 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.533927 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2gj8" Apr 16 13:59:25.902025 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:25.901996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:25.902156 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.902103 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.902212 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:25.902163 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.902141066 +0000 UTC m=+4.242515905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.916948 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.916908 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca33a748_eaae_40ea_9131_81e3f97ea69d.slice/crio-5c5d1f744237ad16b86e7bed767168d78171547cba7f0f1e4ed0efb1aed52147 WatchSource:0}: Error finding container 5c5d1f744237ad16b86e7bed767168d78171547cba7f0f1e4ed0efb1aed52147: Status 404 returned error can't find the container with id 5c5d1f744237ad16b86e7bed767168d78171547cba7f0f1e4ed0efb1aed52147 Apr 16 13:59:25.918384 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.918348 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1a0175_ff05_496f_a2ab_9b87059cf3c3.slice/crio-5a35622a213444ef3f8f92a8f68ee88064a4cb19b042ddc4ba7d2dc3d548e30a WatchSource:0}: Error finding container 5a35622a213444ef3f8f92a8f68ee88064a4cb19b042ddc4ba7d2dc3d548e30a: Status 404 returned error can't find the container with id 5a35622a213444ef3f8f92a8f68ee88064a4cb19b042ddc4ba7d2dc3d548e30a Apr 16 13:59:25.919601 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.919542 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6def905_3f86_432f_b6ba_a5f4649cc324.slice/crio-84232f66ae15fe38d1eb9ff4d26137e5d7d409b5136ce4c743546433ec45fa9d WatchSource:0}: Error finding container 84232f66ae15fe38d1eb9ff4d26137e5d7d409b5136ce4c743546433ec45fa9d: Status 404 returned error can't find the container with id 84232f66ae15fe38d1eb9ff4d26137e5d7d409b5136ce4c743546433ec45fa9d Apr 16 13:59:25.920934 
ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.920911 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6588176_b995_4b14_80e6_c2ba40893912.slice/crio-3b5637efe6d4eb0559b79ca5337637eefd62e7d7895f0f88d5a9921b15dd198f WatchSource:0}: Error finding container 3b5637efe6d4eb0559b79ca5337637eefd62e7d7895f0f88d5a9921b15dd198f: Status 404 returned error can't find the container with id 3b5637efe6d4eb0559b79ca5337637eefd62e7d7895f0f88d5a9921b15dd198f Apr 16 13:59:25.923872 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.923848 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b74162e_fbf4_4ede_b8ad_5623c1094615.slice/crio-0b2764f05fd37112583276966bcba0a284990197464fbce4f22a5be5d3ae0b07 WatchSource:0}: Error finding container 0b2764f05fd37112583276966bcba0a284990197464fbce4f22a5be5d3ae0b07: Status 404 returned error can't find the container with id 0b2764f05fd37112583276966bcba0a284990197464fbce4f22a5be5d3ae0b07 Apr 16 13:59:25.924827 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.924778 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdcb6f5a_276e_476f_ace0_4bc3f243da52.slice/crio-24ccf4989e66907a67550c0ce82a7a3e5b165529a8f5cb9c419ba114b10d9271 WatchSource:0}: Error finding container 24ccf4989e66907a67550c0ce82a7a3e5b165529a8f5cb9c419ba114b10d9271: Status 404 returned error can't find the container with id 24ccf4989e66907a67550c0ce82a7a3e5b165529a8f5cb9c419ba114b10d9271 Apr 16 13:59:25.925610 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.925325 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44914f75_f504_40b2_932d_a36d8319394c.slice/crio-bd487a288b7e521b95184f8cc296b285e2f15fd9e19058707a8aff422b0e75da WatchSource:0}: Error 
finding container bd487a288b7e521b95184f8cc296b285e2f15fd9e19058707a8aff422b0e75da: Status 404 returned error can't find the container with id bd487a288b7e521b95184f8cc296b285e2f15fd9e19058707a8aff422b0e75da Apr 16 13:59:25.927189 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.927169 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169891fa_6d6f_48ee_a833_f55805467ffd.slice/crio-d3c41feacfdc28280b9d637762b5317a2cae156ee19ca5812d592497cc8d21b0 WatchSource:0}: Error finding container d3c41feacfdc28280b9d637762b5317a2cae156ee19ca5812d592497cc8d21b0: Status 404 returned error can't find the container with id d3c41feacfdc28280b9d637762b5317a2cae156ee19ca5812d592497cc8d21b0 Apr 16 13:59:25.928196 ip-10-0-142-16 kubenswrapper[2569]: W0416 13:59:25.928176 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f104cac_2458_4f3f_b7d2_b71aef2dff52.slice/crio-89b8b0e740b10394bf1b4e3b05f196f2aa971e0103190772382220aa2f22bab8 WatchSource:0}: Error finding container 89b8b0e740b10394bf1b4e3b05f196f2aa971e0103190772382220aa2f22bab8: Status 404 returned error can't find the container with id 89b8b0e740b10394bf1b4e3b05f196f2aa971e0103190772382220aa2f22bab8 Apr 16 13:59:26.003328 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.003144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:26.003447 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.003306 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 16 13:59:26.003447 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.003371 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:26.003447 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.003383 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.003608 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.003453 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:27.003436197 +0000 UTC m=+4.343811047 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.243336 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.243219 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:24 +0000 UTC" deadline="2028-01-30 10:36:34.127517402 +0000 UTC" Apr 16 13:59:26.243336 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.243253 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15692h37m7.884267094s" Apr 16 13:59:26.308280 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.308250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:26.308480 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.308380 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:26.316642 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.316610 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" event={"ID":"104ec126e58c79948296ecdd10d4aa5b","Type":"ContainerStarted","Data":"0b60606f81517eeba987cd6541b3cf420322fd947b987d8cb13e8c23549d3ee1"} Apr 16 13:59:26.318093 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.318070 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2gj8" event={"ID":"5f104cac-2458-4f3f-b7d2-b71aef2dff52","Type":"ContainerStarted","Data":"89b8b0e740b10394bf1b4e3b05f196f2aa971e0103190772382220aa2f22bab8"} Apr 16 13:59:26.319956 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.319933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" event={"ID":"169891fa-6d6f-48ee-a833-f55805467ffd","Type":"ContainerStarted","Data":"d3c41feacfdc28280b9d637762b5317a2cae156ee19ca5812d592497cc8d21b0"} Apr 16 13:59:26.321652 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.321625 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" event={"ID":"bdcb6f5a-276e-476f-ace0-4bc3f243da52","Type":"ContainerStarted","Data":"24ccf4989e66907a67550c0ce82a7a3e5b165529a8f5cb9c419ba114b10d9271"} Apr 16 13:59:26.323078 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.323046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"84232f66ae15fe38d1eb9ff4d26137e5d7d409b5136ce4c743546433ec45fa9d"} Apr 16 13:59:26.324853 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.324831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerStarted","Data":"bd487a288b7e521b95184f8cc296b285e2f15fd9e19058707a8aff422b0e75da"} Apr 16 13:59:26.326065 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.326046 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f5zrv" event={"ID":"2b74162e-fbf4-4ede-b8ad-5623c1094615","Type":"ContainerStarted","Data":"0b2764f05fd37112583276966bcba0a284990197464fbce4f22a5be5d3ae0b07"} Apr 16 13:59:26.327929 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.327905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xv4n6" event={"ID":"d6588176-b995-4b14-80e6-c2ba40893912","Type":"ContainerStarted","Data":"3b5637efe6d4eb0559b79ca5337637eefd62e7d7895f0f88d5a9921b15dd198f"} Apr 16 13:59:26.329292 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.329261 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lpgrv" event={"ID":"ba1a0175-ff05-496f-a2ab-9b87059cf3c3","Type":"ContainerStarted","Data":"5a35622a213444ef3f8f92a8f68ee88064a4cb19b042ddc4ba7d2dc3d548e30a"} Apr 16 13:59:26.331140 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.331110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p4h9n" event={"ID":"ca33a748-eaae-40ea-9131-81e3f97ea69d","Type":"ContainerStarted","Data":"5c5d1f744237ad16b86e7bed767168d78171547cba7f0f1e4ed0efb1aed52147"} Apr 16 13:59:26.331966 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.331893 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" podStartSLOduration=2.331880733 podStartE2EDuration="2.331880733s" podCreationTimestamp="2026-04-16 13:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 13:59:26.331513043 +0000 UTC m=+3.671887896" watchObservedRunningTime="2026-04-16 13:59:26.331880733 +0000 UTC m=+3.672255584" Apr 16 13:59:26.910577 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.909942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:26.910577 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.910121 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:26.910577 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:26.910186 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.910166882 +0000 UTC m=+6.250541714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:26.950724 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:26.950695 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:27.011272 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:27.011236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:27.011451 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:27.011430 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:27.011524 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:27.011453 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:27.011524 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:27.011465 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:27.011524 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:27.011523 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.011505737 +0000 UTC m=+6.351880568 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:27.309571 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:27.309491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:27.310029 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:27.309622 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:27.354666 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:27.354066 2569 generic.go:358] "Generic (PLEG): container finished" podID="173ef8d8e696e627a768316289085c1e" containerID="d208b185b0d5d4856d2f4413ad697da433273f4020cc6498200fc4e2cf9fbf09" exitCode=0 Apr 16 13:59:27.354666 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:27.354608 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" event={"ID":"173ef8d8e696e627a768316289085c1e","Type":"ContainerDied","Data":"d208b185b0d5d4856d2f4413ad697da433273f4020cc6498200fc4e2cf9fbf09"} Apr 16 13:59:28.309250 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:28.308768 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:28.309250 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:28.308903 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:28.359979 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:28.359942 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" event={"ID":"173ef8d8e696e627a768316289085c1e","Type":"ContainerStarted","Data":"9d6dfc5175fea930f994f62b5d9292515826642becd0334da28d0b4f115b6512"} Apr 16 13:59:28.376461 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:28.376383 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" podStartSLOduration=4.376365167 podStartE2EDuration="4.376365167s" podCreationTimestamp="2026-04-16 13:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:28.376221757 +0000 UTC m=+5.716596610" watchObservedRunningTime="2026-04-16 13:59:28.376365167 +0000 UTC m=+5.716740018" Apr 16 13:59:28.928089 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:28.927460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:28.928089 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:28.927612 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:28.928089 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:28.927692 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c 
nodeName:}" failed. No retries permitted until 2026-04-16 13:59:32.927651914 +0000 UTC m=+10.268026758 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:29.028074 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:29.028044 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:29.028431 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:29.028219 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:29.028431 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:29.028247 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:29.028431 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:29.028260 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:29.028431 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:29.028316 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf 
podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:33.028296762 +0000 UTC m=+10.368671595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:29.309535 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:29.309460 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:29.309679 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:29.309597 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:30.308887 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:30.308851 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:30.309327 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:30.308995 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:31.309127 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:31.309096 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:31.309598 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:31.309223 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:32.309792 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:32.309240 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:32.309792 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:32.309380 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:32.961919 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:32.961881 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:32.962114 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:32.962012 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:32.962183 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:32.962130 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:40.962114561 +0000 UTC m=+18.302489392 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:33.062422 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:33.062313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:33.062603 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:33.062523 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:33.062603 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:33.062579 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:33.062603 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:33.062595 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:33.062765 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:33.062668 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:41.062649157 +0000 UTC m=+18.403024008 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:33.310189 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:33.309683 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:33.310189 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:33.309778 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:34.309266 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:34.309227 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:34.309450 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:34.309377 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:35.308920 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:35.308887 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:35.309334 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:35.308998 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:36.308779 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:36.308740 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:36.308932 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:36.308882 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:37.308785 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:37.308752 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:37.309170 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:37.308851 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:38.309003 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:38.308967 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:38.309450 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:38.309083 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:39.308695 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:39.308664 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:39.308859 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:39.308765 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:40.308643 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:40.308610 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:40.309083 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:40.308754 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:41.023100 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:41.023047 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:41.023284 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.023236 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:41.023350 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.023307 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.023287281 +0000 UTC m=+34.363662113 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:41.123560 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:41.123529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:41.123691 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.123640 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:41.123691 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.123651 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:41.123691 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.123659 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:41.123788 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.123701 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:57.123688795 +0000 UTC m=+34.464063625 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:41.308452 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:41.308364 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:41.308594 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:41.308512 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:42.309050 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:42.309025 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:42.309323 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:42.309136 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:43.309217 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.308966 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:43.309689 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:43.309274 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:43.386011 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.385993 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log" Apr 16 13:59:43.386310 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.386286 2569 generic.go:358] "Generic (PLEG): container finished" podID="e6def905-3f86-432f-b6ba-a5f4649cc324" containerID="95b3cf97193292d50769fbe4ecb7c50256a73af52fd2ebd51140e0956de9be53" exitCode=1 Apr 16 13:59:43.386382 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.386357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"580807dee83989bd5868952dabe99e4bfbeb6e791f5b12bc42ba13e3499d6658"} Apr 16 13:59:43.386441 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.386412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" 
event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"573160a924ac1577928a3e27a8377b82e5fd28a372cde1a5d5bc21944684ca42"} Apr 16 13:59:43.386441 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.386428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"2226bc777a97b30aa6fc8edcf717f7db4d60cc4b3fc976636c7599604390ce8c"} Apr 16 13:59:43.386534 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.386441 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerDied","Data":"95b3cf97193292d50769fbe4ecb7c50256a73af52fd2ebd51140e0956de9be53"} Apr 16 13:59:43.386534 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.386456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"f9ee5f0d18a48c189e46646f84745603436f52135053bf83a68c38f10ba64b5a"} Apr 16 13:59:43.387764 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.387745 2569 generic.go:358] "Generic (PLEG): container finished" podID="44914f75-f504-40b2-932d-a36d8319394c" containerID="3daa92558edd13123e78675c1445dba875ac1bf5dd15276228348481ae35b164" exitCode=0 Apr 16 13:59:43.387842 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.387804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerDied","Data":"3daa92558edd13123e78675c1445dba875ac1bf5dd15276228348481ae35b164"} Apr 16 13:59:43.389098 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.389043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f5zrv" 
event={"ID":"2b74162e-fbf4-4ede-b8ad-5623c1094615","Type":"ContainerStarted","Data":"6003c074041e742f4f39e6ec8aa61487ee2bad851a57d66da34ab3bc27ef2250"} Apr 16 13:59:43.390236 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.390207 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xv4n6" event={"ID":"d6588176-b995-4b14-80e6-c2ba40893912","Type":"ContainerStarted","Data":"edd8544e7951caa61e407766e1af804b2f8f72715cdb4f6f92a72b72834839df"} Apr 16 13:59:43.391498 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.391481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p4h9n" event={"ID":"ca33a748-eaae-40ea-9131-81e3f97ea69d","Type":"ContainerStarted","Data":"02968388d488790465ecf29dab8e132493ee17786c13a68ef81aba8fe4cdf450"} Apr 16 13:59:43.392772 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.392752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2gj8" event={"ID":"5f104cac-2458-4f3f-b7d2-b71aef2dff52","Type":"ContainerStarted","Data":"a968f11f758f25cec2d0909ed3fd4ca7beba501c05610124c775ecd8c0396213"} Apr 16 13:59:43.394084 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.394059 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" event={"ID":"169891fa-6d6f-48ee-a833-f55805467ffd","Type":"ContainerStarted","Data":"ae66a613bc05792d862b03b4abd58111ddb7c893201ca9758ba6e05536b21e2b"} Apr 16 13:59:43.395583 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.395564 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" event={"ID":"bdcb6f5a-276e-476f-ace0-4bc3f243da52","Type":"ContainerStarted","Data":"14f1b9b7488f0e0da10fb9c883647a795c904e2ea3ff05724786d65641735b01"} Apr 16 13:59:43.434611 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.434571 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-ql6p9" podStartSLOduration=3.7986902110000003 podStartE2EDuration="20.434561034s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.926577338 +0000 UTC m=+3.266952182" lastFinishedPulling="2026-04-16 13:59:42.56244817 +0000 UTC m=+19.902823005" observedRunningTime="2026-04-16 13:59:43.434366496 +0000 UTC m=+20.774741350" watchObservedRunningTime="2026-04-16 13:59:43.434561034 +0000 UTC m=+20.774935886" Apr 16 13:59:43.450911 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.450878 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-f5zrv" podStartSLOduration=4.093096963 podStartE2EDuration="20.450870925s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.925896116 +0000 UTC m=+3.266270961" lastFinishedPulling="2026-04-16 13:59:42.283670091 +0000 UTC m=+19.624044923" observedRunningTime="2026-04-16 13:59:43.450522995 +0000 UTC m=+20.790897847" watchObservedRunningTime="2026-04-16 13:59:43.450870925 +0000 UTC m=+20.791245777" Apr 16 13:59:43.468556 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.468518 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p4h9n" podStartSLOduration=3.825016871 podStartE2EDuration="20.468509207s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.918947799 +0000 UTC m=+3.259322630" lastFinishedPulling="2026-04-16 13:59:42.562440117 +0000 UTC m=+19.902814966" observedRunningTime="2026-04-16 13:59:43.468452578 +0000 UTC m=+20.808827429" watchObservedRunningTime="2026-04-16 13:59:43.468509207 +0000 UTC m=+20.808884058" Apr 16 13:59:43.482228 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.482192 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xv4n6" podStartSLOduration=4.121482489 
podStartE2EDuration="20.482183423s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.922967342 +0000 UTC m=+3.263342173" lastFinishedPulling="2026-04-16 13:59:42.283668263 +0000 UTC m=+19.624043107" observedRunningTime="2026-04-16 13:59:43.482141479 +0000 UTC m=+20.822516326" watchObservedRunningTime="2026-04-16 13:59:43.482183423 +0000 UTC m=+20.822558276" Apr 16 13:59:43.501263 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:43.501231 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t2gj8" podStartSLOduration=3.843040824 podStartE2EDuration="20.501222978s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.929751811 +0000 UTC m=+3.270126655" lastFinishedPulling="2026-04-16 13:59:42.587933975 +0000 UTC m=+19.928308809" observedRunningTime="2026-04-16 13:59:43.501013247 +0000 UTC m=+20.841388099" watchObservedRunningTime="2026-04-16 13:59:43.501222978 +0000 UTC m=+20.841597829" Apr 16 13:59:44.309489 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.309324 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:44.309916 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:44.309562 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:44.399311 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.399263 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lpgrv" event={"ID":"ba1a0175-ff05-496f-a2ab-9b87059cf3c3","Type":"ContainerStarted","Data":"c84fa281b0dd92f5639c950f21cab43925e055540ae214a8838ea51c8e24082e"} Apr 16 13:59:44.403144 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.403118 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log" Apr 16 13:59:44.404011 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.403592 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"ddd15b39adff008540a1b94723daa60fa6318e42aa634c49ccfec1d49e47a817"} Apr 16 13:59:44.416474 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.416427 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lpgrv" podStartSLOduration=4.7739795659999995 podStartE2EDuration="21.416411399s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.920006915 +0000 UTC m=+3.260381759" lastFinishedPulling="2026-04-16 13:59:42.562438743 +0000 UTC m=+19.902813592" observedRunningTime="2026-04-16 13:59:44.415806028 +0000 UTC m=+21.756180930" watchObservedRunningTime="2026-04-16 13:59:44.416411399 +0000 UTC m=+21.756786244" Apr 16 13:59:44.628797 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.628722 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:44.629435 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:44.629410 2569 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:45.073652 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:45.073627 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:45.257574 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:45.257425 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:45.073648548Z","UUID":"1b814ae4-661e-4b70-8b32-89b6ab90b1ab","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:45.259548 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:45.259525 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:45.259667 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:45.259556 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:45.308752 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:45.308673 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:45.308876 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:45.308770 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:45.407057 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:45.407010 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" event={"ID":"169891fa-6d6f-48ee-a833-f55805467ffd","Type":"ContainerStarted","Data":"80591fd02e12dbd0d94d242b901b00597a286f980cea6ba0f89dd668240fae03"} Apr 16 13:59:46.308735 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:46.308703 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:46.308903 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:46.308826 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:46.412089 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:46.412055 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log" Apr 16 13:59:46.412547 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:46.412511 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"23e7d7f745c5e540e78ed36d10f476ecad99334324283802d3729295b43cbc96"} Apr 16 13:59:46.412676 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:46.412530 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:47.073594 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:47.073553 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:47.074214 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:47.074193 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-f5zrv" Apr 16 13:59:47.309106 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:47.308918 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:47.309250 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:47.309190 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:48.309225 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.309145 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:48.309909 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:48.309244 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:48.418097 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.418066 2569 generic.go:358] "Generic (PLEG): container finished" podID="44914f75-f504-40b2-932d-a36d8319394c" containerID="64b3fd68af120b5fea53584122ef4700637e133390f0381c4611f4e65b7b05a0" exitCode=0 Apr 16 13:59:48.418253 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.418141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerDied","Data":"64b3fd68af120b5fea53584122ef4700637e133390f0381c4611f4e65b7b05a0"} Apr 16 13:59:48.420033 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.420009 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" event={"ID":"169891fa-6d6f-48ee-a833-f55805467ffd","Type":"ContainerStarted","Data":"6cff9f814f767ff6dbbc9bda1b55769ec129b30a2e4b70c951a75c9619964ca3"} Apr 16 13:59:48.423076 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.422902 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log" Apr 16 13:59:48.423388 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.423370 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"2d394b515588df920cd1c49c924151e5769ed47d5503b3c089f2be2d10025d37"} Apr 16 13:59:48.423675 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.423650 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:48.423675 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.423677 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:48.423798 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.423689 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:48.423798 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.423784 2569 scope.go:117] "RemoveContainer" containerID="95b3cf97193292d50769fbe4ecb7c50256a73af52fd2ebd51140e0956de9be53" Apr 16 13:59:48.441966 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.439653 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:48.442812 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.442791 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" Apr 16 13:59:48.483508 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:48.483462 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xrmkp" podStartSLOduration=3.664877807 podStartE2EDuration="25.483447565s" 
podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.929345426 +0000 UTC m=+3.269720259" lastFinishedPulling="2026-04-16 13:59:47.747915182 +0000 UTC m=+25.088290017" observedRunningTime="2026-04-16 13:59:48.48342615 +0000 UTC m=+25.823801003" watchObservedRunningTime="2026-04-16 13:59:48.483447565 +0000 UTC m=+25.823822408" Apr 16 13:59:49.309906 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.309873 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:49.310364 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:49.310000 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:49.429219 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.429188 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log" Apr 16 13:59:49.429617 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.429569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" event={"ID":"e6def905-3f86-432f-b6ba-a5f4649cc324","Type":"ContainerStarted","Data":"37d45780aebc918f4aeacd84c8bfe1db9ce2f19538af9388d43a2a6712da60f8"} Apr 16 13:59:49.461452 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.461018 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z" podStartSLOduration=9.751971627 podStartE2EDuration="26.460983314s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" 
firstStartedPulling="2026-04-16 13:59:25.92200269 +0000 UTC m=+3.262377536" lastFinishedPulling="2026-04-16 13:59:42.63101438 +0000 UTC m=+19.971389223" observedRunningTime="2026-04-16 13:59:49.460960431 +0000 UTC m=+26.801335279" watchObservedRunningTime="2026-04-16 13:59:49.460983314 +0000 UTC m=+26.801358168" Apr 16 13:59:49.731534 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.731454 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lfj5m"] Apr 16 13:59:49.731721 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.731587 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:49.731721 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:49.731686 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:49.736071 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.736044 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f6dlw"] Apr 16 13:59:49.736173 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:49.736147 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:49.736252 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:49.736234 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:50.433930 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:50.433895 2569 generic.go:358] "Generic (PLEG): container finished" podID="44914f75-f504-40b2-932d-a36d8319394c" containerID="38c5891f486aa004e05bec04df1174cc68ec4aa20d6009e5eb8a1de05527ca08" exitCode=0 Apr 16 13:59:50.434503 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:50.433953 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerDied","Data":"38c5891f486aa004e05bec04df1174cc68ec4aa20d6009e5eb8a1de05527ca08"} Apr 16 13:59:51.308824 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:51.308788 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:51.308824 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:51.308814 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:51.309036 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:51.308933 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:51.309099 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:51.309066 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:52.440300 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:52.440266 2569 generic.go:358] "Generic (PLEG): container finished" podID="44914f75-f504-40b2-932d-a36d8319394c" containerID="8e06d8ec72f871ce20c292094ade87ea37f47389a01f4b2d6a67b9c5a3215224" exitCode=0 Apr 16 13:59:52.440663 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:52.440311 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerDied","Data":"8e06d8ec72f871ce20c292094ade87ea37f47389a01f4b2d6a67b9c5a3215224"} Apr 16 13:59:53.309248 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:53.309215 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:53.309422 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:53.309296 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:53.309422 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:53.309368 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:53.309505 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:53.309456 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:55.308801 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.308765 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 13:59:55.308801 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.308796 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m" Apr 16 13:59:55.309422 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:55.308897 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-f6dlw" podUID="9698ff93-a877-4a74-b2ff-29e433108995" Apr 16 13:59:55.309422 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:55.309030 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lfj5m" podUID="44d7f301-04c1-422a-a689-9d0e4f02952c" Apr 16 13:59:55.437752 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.437721 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeReady" Apr 16 13:59:55.437920 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.437856 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:55.490569 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.490517 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p6gtp"] Apr 16 13:59:55.492695 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.492670 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6gtp" Apr 16 13:59:55.495626 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.495603 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tpzlf\"" Apr 16 13:59:55.496142 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.496011 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:59:55.496321 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.496301 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:59:55.497467 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.497450 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b4d9q"] Apr 16 13:59:55.500028 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.499940 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:55.502672 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.502651 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:59:55.502825 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.502676 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbpqw\""
Apr 16 13:59:55.502825 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.502656 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:59:55.504797 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.504775 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:59:55.506049 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.506028 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p6gtp"]
Apr 16 13:59:55.519772 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.519744 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b4d9q"]
Apr 16 13:59:55.636671 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.636580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-config-volume\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.636671 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.636630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-tmp-dir\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.636671 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.636667 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs4cx\" (UniqueName: \"kubernetes.io/projected/852d4d38-4926-4c6a-a9ad-11a60019138a-kube-api-access-bs4cx\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:55.636888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.636697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdtnq\" (UniqueName: \"kubernetes.io/projected/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-kube-api-access-wdtnq\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.636888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.636717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:55.636888 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.636742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738109 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-config-volume\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738283 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-tmp-dir\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738283 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs4cx\" (UniqueName: \"kubernetes.io/projected/852d4d38-4926-4c6a-a9ad-11a60019138a-kube-api-access-bs4cx\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:55.738283 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdtnq\" (UniqueName: \"kubernetes.io/projected/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-kube-api-access-wdtnq\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738283 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738253 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:55.738283 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738576 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:55.738366 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:55.738576 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:55.738441 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:56.238420691 +0000 UTC m=+33.578795534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:55.738576 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-tmp-dir\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738738 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.738700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-config-volume\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:55.738738 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:55.738712 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:55.738822 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:55.738760 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:56.238745503 +0000 UTC m=+33.579120337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 13:59:55.749416 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.749375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs4cx\" (UniqueName: \"kubernetes.io/projected/852d4d38-4926-4c6a-a9ad-11a60019138a-kube-api-access-bs4cx\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:55.749547 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:55.749391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdtnq\" (UniqueName: \"kubernetes.io/projected/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-kube-api-access-wdtnq\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:56.241428 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:56.241378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:56.241614 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:56.241443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:56.241614 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:56.241517 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:56.241614 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:56.241536 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:56.241614 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:56.241593 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.241577191 +0000 UTC m=+34.581952041 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:56.241614 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:56.241610 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:57.241602592 +0000 UTC m=+34.581977428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 13:59:57.047432 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.047372 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 13:59:57.048090 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.047522 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:57.048090 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.047605 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:29.047584853 +0000 UTC m=+66.387959694 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:57.148433 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.148387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 13:59:57.148609 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.148540 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:57.148609 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.148556 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:57.148609 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.148565 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fndrf for pod openshift-network-diagnostics/network-check-target-f6dlw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:57.148729 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.148630 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf podName:9698ff93-a877-4a74-b2ff-29e433108995 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:29.148614066 +0000 UTC m=+66.488988903 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fndrf" (UniqueName: "kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf") pod "network-check-target-f6dlw" (UID: "9698ff93-a877-4a74-b2ff-29e433108995") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:57.249140 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.249098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:57.249140 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.249147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:57.249363 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.249280 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:57.249363 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.249358 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.249340356 +0000 UTC m=+36.589715201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 13:59:57.249493 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.249281 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:57.249493 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:57.249419 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.249409362 +0000 UTC m=+36.589784197 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:57.308695 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.308604 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 13:59:57.308852 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.308800 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 13:59:57.310981 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.310952 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:59:57.311116 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.311001 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:59:57.311116 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.311023 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:59:57.311745 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.311726 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gs8rq\""
Apr 16 13:59:57.311850 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:57.311783 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x5z8x\""
Apr 16 13:59:59.264633 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:59.264601 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 13:59:59.265084 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:59.264689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 13:59:59.265084 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:59.264748 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:59:59.265084 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:59.264781 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:59:59.265084 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:59.264817 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:03.264801913 +0000 UTC m=+40.605176743 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 13:59:59.265084 ip-10-0-142-16 kubenswrapper[2569]: E0416 13:59:59.264831 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 14:00:03.264825406 +0000 UTC m=+40.605200236 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 13:59:59.456409 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:59.456369 2569 generic.go:358] "Generic (PLEG): container finished" podID="44914f75-f504-40b2-932d-a36d8319394c" containerID="1b444cf0d30606b7e6e1d21beb6d55e7db54c21e1d0d3e5218bc74ca5e7f04b2" exitCode=0
Apr 16 13:59:59.456541 ip-10-0-142-16 kubenswrapper[2569]: I0416 13:59:59.456425 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerDied","Data":"1b444cf0d30606b7e6e1d21beb6d55e7db54c21e1d0d3e5218bc74ca5e7f04b2"}
Apr 16 14:00:00.460925 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:00.460893 2569 generic.go:358] "Generic (PLEG): container finished" podID="44914f75-f504-40b2-932d-a36d8319394c" containerID="eb458f199d2cf41c82eea93684ae3a20ac256d6bc3483e3ddfaaa5d1b0e48516" exitCode=0
Apr 16 14:00:00.461279 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:00.460945 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerDied","Data":"eb458f199d2cf41c82eea93684ae3a20ac256d6bc3483e3ddfaaa5d1b0e48516"}
Apr 16 14:00:01.464983 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:01.464796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wlchs" event={"ID":"44914f75-f504-40b2-932d-a36d8319394c","Type":"ContainerStarted","Data":"2e8a7a983136951b58674cc89e71ba77ffecefbc2281e71acda22b5600a44102"}
Apr 16 14:00:01.490029 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:01.489922 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wlchs" podStartSLOduration=5.854944633 podStartE2EDuration="38.489908592s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:25.927426606 +0000 UTC m=+3.267801450" lastFinishedPulling="2026-04-16 13:59:58.562390578 +0000 UTC m=+35.902765409" observedRunningTime="2026-04-16 14:00:01.489062605 +0000 UTC m=+38.829437457" watchObservedRunningTime="2026-04-16 14:00:01.489908592 +0000 UTC m=+38.830283445"
Apr 16 14:00:03.293411 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:03.293371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 14:00:03.293411 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:03.293415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 14:00:03.293858 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:03.293493 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:03.293858 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:03.293508 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:03.293858 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:03.293531 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.29351833 +0000 UTC m=+48.633893160 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:03.293858 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:03.293567 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.293549853 +0000 UTC m=+48.633924686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 14:00:11.346054 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:11.346022 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 14:00:11.346054 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:11.346058 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 14:00:11.346462 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:11.346153 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:11.346462 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:11.346154 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:11.346462 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:11.346201 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.346187463 +0000 UTC m=+64.686562293 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:11.346462 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:11.346213 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.346207501 +0000 UTC m=+64.686582331 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 14:00:20.449013 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:20.448985 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-prt2z"
Apr 16 14:00:27.436466 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:27.436419 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q"
Apr 16 14:00:27.436466 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:27.436469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp"
Apr 16 14:00:27.436952 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:27.436556 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:27.436952 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:27.436558 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:27.436952 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:27.436618 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.436599054 +0000 UTC m=+96.776973890 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:27.436952 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:27.436636 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.436627453 +0000 UTC m=+96.777002283 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found
Apr 16 14:00:29.047644 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.047610 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 14:00:29.050175 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.050157 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:29.057968 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:29.057945 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:00:29.058057 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:29.058015 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs podName:44d7f301-04c1-422a-a689-9d0e4f02952c nodeName:}" failed. No retries permitted until 2026-04-16 14:01:33.057992547 +0000 UTC m=+130.398367395 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs") pod "network-metrics-daemon-lfj5m" (UID: "44d7f301-04c1-422a-a689-9d0e4f02952c") : secret "metrics-daemon-secret" not found
Apr 16 14:00:29.249565 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.249531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 14:00:29.251903 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.251883 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:00:29.262872 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.262853 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:00:29.274839 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.274821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndrf\" (UniqueName: \"kubernetes.io/projected/9698ff93-a877-4a74-b2ff-29e433108995-kube-api-access-fndrf\") pod \"network-check-target-f6dlw\" (UID: \"9698ff93-a877-4a74-b2ff-29e433108995\") " pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 14:00:29.429843 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.429821 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gs8rq\""
Apr 16 14:00:29.438526 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.438507 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 14:00:29.557744 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:29.557711 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-f6dlw"]
Apr 16 14:00:29.561719 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:00:29.561693 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9698ff93_a877_4a74_b2ff_29e433108995.slice/crio-c59d618b1fa30beb3129631a60f27979afa4b5d49bf324f422f0ad1f7dc13193 WatchSource:0}: Error finding container c59d618b1fa30beb3129631a60f27979afa4b5d49bf324f422f0ad1f7dc13193: Status 404 returned error can't find the container with id c59d618b1fa30beb3129631a60f27979afa4b5d49bf324f422f0ad1f7dc13193
Apr 16 14:00:30.517142 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:30.517098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f6dlw" event={"ID":"9698ff93-a877-4a74-b2ff-29e433108995","Type":"ContainerStarted","Data":"c59d618b1fa30beb3129631a60f27979afa4b5d49bf324f422f0ad1f7dc13193"}
Apr 16 14:00:32.522952 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:32.522920 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-f6dlw" event={"ID":"9698ff93-a877-4a74-b2ff-29e433108995","Type":"ContainerStarted","Data":"4daa3ee3510f07c86f84a82501f4f4b8acefccd790de168560de7ace9df937cb"}
Apr 16 14:00:32.523351 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:32.523138 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-f6dlw"
Apr 16 14:00:32.542110 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:32.542066 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-f6dlw" podStartSLOduration=66.956653564 podStartE2EDuration="1m9.542054288s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 14:00:29.563903288 +0000 UTC m=+66.904278119" lastFinishedPulling="2026-04-16 14:00:32.149304012 +0000 UTC m=+69.489678843" observedRunningTime="2026-04-16 14:00:32.541731962 +0000 UTC m=+69.882106817" watchObservedRunningTime="2026-04-16 14:00:32.542054288 +0000 UTC m=+69.882429136"
Apr 16 14:00:57.534756 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.534720 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kpwpf"]
Apr 16 14:00:57.538649 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.538627 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf"
Apr 16 14:00:57.540813 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.540790 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 14:00:57.540922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.540837 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 14:00:57.540922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.540859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-x8dzr\""
Apr 16 14:00:57.541365 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.541342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:00:57.541485 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.541426 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 14:00:57.546062 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.546020 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 14:00:57.547262 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.547242 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kpwpf"]
Apr 16 14:00:57.636800 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.636768 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"]
Apr 16 14:00:57.639488 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.639472 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"
Apr 16 14:00:57.642091 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642061 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxp5\" (UniqueName: \"kubernetes.io/projected/cb232208-c05b-4821-9c83-1582341d5232-kube-api-access-cpxp5\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf"
Apr 16 14:00:57.642227 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642163 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb232208-c05b-4821-9c83-1582341d5232-config\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf"
Apr 16 14:00:57.642323 ip-10-0-142-16 kubenswrapper[2569]: I0416
14:00:57.642244 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb232208-c05b-4821-9c83-1582341d5232-trusted-ca\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.642382 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb232208-c05b-4821-9c83-1582341d5232-serving-cert\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.642540 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642526 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5c4787ff58-x4l8s"] Apr 16 14:00:57.642626 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642531 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:00:57.642804 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642787 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4lwtk\"" Apr 16 14:00:57.642879 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642859 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.642937 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.642926 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:00:57.643191 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.643175 2569 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.645282 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.645265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.647222 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.647205 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-wbwvt\"" Apr 16 14:00:57.647294 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.647205 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 14:00:57.647999 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.647984 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 14:00:57.648276 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.648260 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.648355 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.648330 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.648355 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.648330 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 14:00:57.648573 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.648555 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 14:00:57.656106 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.656080 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"] Apr 16 
14:00:57.659266 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.659243 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c4787ff58-x4l8s"] Apr 16 14:00:57.743312 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743273 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxp5\" (UniqueName: \"kubernetes.io/projected/cb232208-c05b-4821-9c83-1582341d5232-kube-api-access-cpxp5\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.743312 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743317 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-default-certificate\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.743546 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743334 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cps6l\" (UniqueName: \"kubernetes.io/projected/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-kube-api-access-cps6l\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.743546 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb232208-c05b-4821-9c83-1582341d5232-config\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.743546 ip-10-0-142-16 
kubenswrapper[2569]: I0416 14:00:57.743480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a78f3464-a81c-413a-a11a-ba6020b56874-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.743546 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743532 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.743702 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743578 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb232208-c05b-4821-9c83-1582341d5232-trusted-ca\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.743702 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743610 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-stats-auth\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.743702 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29dx6\" (UniqueName: 
\"kubernetes.io/projected/a78f3464-a81c-413a-a11a-ba6020b56874-kube-api-access-29dx6\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.743702 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.743981 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.743981 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.743822 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb232208-c05b-4821-9c83-1582341d5232-serving-cert\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.744381 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.744355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb232208-c05b-4821-9c83-1582341d5232-config\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.744505 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.744361 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb232208-c05b-4821-9c83-1582341d5232-trusted-ca\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.746162 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.746143 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb232208-c05b-4821-9c83-1582341d5232-serving-cert\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.757145 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.757122 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf"] Apr 16 14:00:57.759200 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.759182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxp5\" (UniqueName: \"kubernetes.io/projected/cb232208-c05b-4821-9c83-1582341d5232-kube-api-access-cpxp5\") pod \"console-operator-d87b8d5fc-kpwpf\" (UID: \"cb232208-c05b-4821-9c83-1582341d5232\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.760038 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.760025 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" Apr 16 14:00:57.762919 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.762894 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj"] Apr 16 14:00:57.765474 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.765458 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.771296 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.771264 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.771582 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.771562 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-hfcrj\"" Apr 16 14:00:57.771681 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.771590 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 14:00:57.771740 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.771723 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jqb7c\"" Apr 16 14:00:57.771797 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.771760 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.772963 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.772947 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 14:00:57.774661 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.773980 2569 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf"] Apr 16 14:00:57.783412 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.783375 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj"] Apr 16 14:00:57.844561 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.844561 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844523 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgszw\" (UniqueName: \"kubernetes.io/projected/f1a2e25c-5259-48d8-865a-7328810adf10-kube-api-access-sgszw\") pod \"network-check-source-7b678d77c7-7t4bf\" (UID: \"f1a2e25c-5259-48d8-865a-7328810adf10\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" Apr 16 14:00:57.844561 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844545 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-stats-auth\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.844776 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29dx6\" (UniqueName: \"kubernetes.io/projected/a78f3464-a81c-413a-a11a-ba6020b56874-kube-api-access-29dx6\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: 
\"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.844776 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:57.844619 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:57.844776 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:57.844687 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:58.344671374 +0000 UTC m=+95.685046205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : secret "router-metrics-certs-default" not found Apr 16 14:00:57.844776 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.844776 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844747 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8c6a6f-0f8c-4610-a6c5-f33111156650-config\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844778 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:57.844806 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:58.344792482 +0000 UTC m=+95.685167333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8c6a6f-0f8c-4610-a6c5-f33111156650-serving-cert\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:57.844849 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:57.844884 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls 
podName:a78f3464-a81c-413a-a11a-ba6020b56874 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:58.34487204 +0000 UTC m=+95.685246874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ht2bj" (UID: "a78f3464-a81c-413a-a11a-ba6020b56874") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844903 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-default-certificate\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cps6l\" (UniqueName: \"kubernetes.io/projected/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-kube-api-access-cps6l\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.844990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844966 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g5n7\" (UniqueName: \"kubernetes.io/projected/5f8c6a6f-0f8c-4610-a6c5-f33111156650-kube-api-access-5g5n7\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.845327 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.844998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a78f3464-a81c-413a-a11a-ba6020b56874-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.845666 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.845649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a78f3464-a81c-413a-a11a-ba6020b56874-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.847070 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.847051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-stats-auth\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.847165 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.847135 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-default-certificate\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.849020 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.849002 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:00:57.852971 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.852947 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29dx6\" (UniqueName: \"kubernetes.io/projected/a78f3464-a81c-413a-a11a-ba6020b56874-kube-api-access-29dx6\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:57.854351 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.854329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cps6l\" (UniqueName: \"kubernetes.io/projected/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-kube-api-access-cps6l\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:57.945868 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.945832 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g5n7\" (UniqueName: \"kubernetes.io/projected/5f8c6a6f-0f8c-4610-a6c5-f33111156650-kube-api-access-5g5n7\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.946165 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.946150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgszw\" (UniqueName: \"kubernetes.io/projected/f1a2e25c-5259-48d8-865a-7328810adf10-kube-api-access-sgszw\") pod \"network-check-source-7b678d77c7-7t4bf\" (UID: \"f1a2e25c-5259-48d8-865a-7328810adf10\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" Apr 16 14:00:57.946231 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.946201 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8c6a6f-0f8c-4610-a6c5-f33111156650-config\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.946231 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.946228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8c6a6f-0f8c-4610-a6c5-f33111156650-serving-cert\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.946869 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.946821 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8c6a6f-0f8c-4610-a6c5-f33111156650-config\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.948342 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.948320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8c6a6f-0f8c-4610-a6c5-f33111156650-serving-cert\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.954589 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.954550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g5n7\" (UniqueName: \"kubernetes.io/projected/5f8c6a6f-0f8c-4610-a6c5-f33111156650-kube-api-access-5g5n7\") pod \"service-ca-operator-69965bb79d-bg6tj\" (UID: \"5f8c6a6f-0f8c-4610-a6c5-f33111156650\") " 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:57.954696 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.954604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgszw\" (UniqueName: \"kubernetes.io/projected/f1a2e25c-5259-48d8-865a-7328810adf10-kube-api-access-sgszw\") pod \"network-check-source-7b678d77c7-7t4bf\" (UID: \"f1a2e25c-5259-48d8-865a-7328810adf10\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" Apr 16 14:00:57.983874 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:57.983834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kpwpf"] Apr 16 14:00:57.987935 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:00:57.987908 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb232208_c05b_4821_9c83_1582341d5232.slice/crio-2ee2c078c0bf65eca49103db7edc185ed6c9ee098d8fa20012d87950b9ab6431 WatchSource:0}: Error finding container 2ee2c078c0bf65eca49103db7edc185ed6c9ee098d8fa20012d87950b9ab6431: Status 404 returned error can't find the container with id 2ee2c078c0bf65eca49103db7edc185ed6c9ee098d8fa20012d87950b9ab6431 Apr 16 14:00:58.072318 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.072283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" Apr 16 14:00:58.079043 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.079021 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" Apr 16 14:00:58.191335 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.191304 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf"] Apr 16 14:00:58.194473 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:00:58.194431 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a2e25c_5259_48d8_865a_7328810adf10.slice/crio-61a99739494143d616b86a3cd15f7757bdcc9b607e91ba967f91ed3c1475b021 WatchSource:0}: Error finding container 61a99739494143d616b86a3cd15f7757bdcc9b607e91ba967f91ed3c1475b021: Status 404 returned error can't find the container with id 61a99739494143d616b86a3cd15f7757bdcc9b607e91ba967f91ed3c1475b021 Apr 16 14:00:58.206486 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.206460 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj"] Apr 16 14:00:58.210427 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:00:58.210389 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f8c6a6f_0f8c_4610_a6c5_f33111156650.slice/crio-20cc8a02b9a499190b14df7284dbe66475b3ecf5f04425eb94a813bbdfcd0d97 WatchSource:0}: Error finding container 20cc8a02b9a499190b14df7284dbe66475b3ecf5f04425eb94a813bbdfcd0d97: Status 404 returned error can't find the container with id 20cc8a02b9a499190b14df7284dbe66475b3ecf5f04425eb94a813bbdfcd0d97 Apr 16 14:00:58.348837 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.348791 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " 
pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.348851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.348879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:58.348930 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:58.348969 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:58.349000 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.34898358 +0000 UTC m=+96.689358411 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : secret "router-metrics-certs-default" not found Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:58.349014 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.349008254 +0000 UTC m=+96.689383084 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:58.349040 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:58.349024 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls podName:a78f3464-a81c-413a-a11a-ba6020b56874 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.349018936 +0000 UTC m=+96.689393767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ht2bj" (UID: "a78f3464-a81c-413a-a11a-ba6020b56874") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:58.577413 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.577372 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" event={"ID":"5f8c6a6f-0f8c-4610-a6c5-f33111156650","Type":"ContainerStarted","Data":"20cc8a02b9a499190b14df7284dbe66475b3ecf5f04425eb94a813bbdfcd0d97"} Apr 16 14:00:58.578635 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.578609 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" event={"ID":"f1a2e25c-5259-48d8-865a-7328810adf10","Type":"ContainerStarted","Data":"9b194c15e0aef491a6dbc395e8f075d884acd4757078042f433f377c88c72b58"} Apr 16 14:00:58.578754 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.578641 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" event={"ID":"f1a2e25c-5259-48d8-865a-7328810adf10","Type":"ContainerStarted","Data":"61a99739494143d616b86a3cd15f7757bdcc9b607e91ba967f91ed3c1475b021"} Apr 16 14:00:58.579623 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.579606 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" event={"ID":"cb232208-c05b-4821-9c83-1582341d5232","Type":"ContainerStarted","Data":"2ee2c078c0bf65eca49103db7edc185ed6c9ee098d8fa20012d87950b9ab6431"} Apr 16 14:00:58.593011 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:58.592963 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-7t4bf" 
podStartSLOduration=1.592948057 podStartE2EDuration="1.592948057s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:58.592799522 +0000 UTC m=+95.933174373" watchObservedRunningTime="2026-04-16 14:00:58.592948057 +0000 UTC m=+95.933322911" Apr 16 14:00:59.356628 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:59.356587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:59.356845 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:59.356645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:00:59.356845 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:59.356741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:00:59.356845 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.356771 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:01:01.356747484 +0000 UTC m=+98.697122331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : configmap references non-existent config key: service-ca.crt Apr 16 14:00:59.356845 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.356835 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:59.357037 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.356835 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:00:59.357037 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.356879 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls podName:a78f3464-a81c-413a-a11a-ba6020b56874 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:01.356866193 +0000 UTC m=+98.697241034 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ht2bj" (UID: "a78f3464-a81c-413a-a11a-ba6020b56874") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:59.357037 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.356908 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:01.356897258 +0000 UTC m=+98.697272089 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : secret "router-metrics-certs-default" not found Apr 16 14:00:59.457161 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:59.457126 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q" Apr 16 14:00:59.457346 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:00:59.457181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp" Apr 16 14:00:59.457346 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.457261 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:59.457346 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.457329 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert podName:852d4d38-4926-4c6a-a9ad-11a60019138a nodeName:}" failed. No retries permitted until 2026-04-16 14:02:03.457311981 +0000 UTC m=+160.797686823 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert") pod "ingress-canary-b4d9q" (UID: "852d4d38-4926-4c6a-a9ad-11a60019138a") : secret "canary-serving-cert" not found Apr 16 14:00:59.457547 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.457343 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:59.457547 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:00:59.457438 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls podName:59e98d1e-f9cf-4faa-bd64-a597149d3bc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:02:03.45738827 +0000 UTC m=+160.797763105 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls") pod "dns-default-p6gtp" (UID: "59e98d1e-f9cf-4faa-bd64-a597149d3bc7") : secret "dns-default-metrics-tls" not found Apr 16 14:01:00.585263 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:00.585171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" event={"ID":"5f8c6a6f-0f8c-4610-a6c5-f33111156650","Type":"ContainerStarted","Data":"9d42854938c9f26bfb7f9938aa28c6d8393aa93bfc1b06427503810e1dc9879a"} Apr 16 14:01:00.603287 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:00.603230 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" podStartSLOduration=1.524477733 podStartE2EDuration="3.603213916s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:00:58.212481527 +0000 UTC m=+95.552856358" lastFinishedPulling="2026-04-16 14:01:00.291217695 +0000 UTC m=+97.631592541" observedRunningTime="2026-04-16 
14:01:00.601254965 +0000 UTC m=+97.941629818" watchObservedRunningTime="2026-04-16 14:01:00.603213916 +0000 UTC m=+97.943588770" Apr 16 14:01:01.372526 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:01.372488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:01.372719 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:01.372546 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:01.372719 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:01.372566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:01:01.372719 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:01.372654 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:01:01.372719 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:01.372679 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:01:05.372661622 +0000 UTC m=+102.713036452 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : configmap references non-existent config key: service-ca.crt Apr 16 14:01:01.372719 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:01.372721 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:05.37270355 +0000 UTC m=+102.713078382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : secret "router-metrics-certs-default" not found Apr 16 14:01:01.372927 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:01.372748 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:01.372927 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:01.372791 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls podName:a78f3464-a81c-413a-a11a-ba6020b56874 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:05.372780096 +0000 UTC m=+102.713154926 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ht2bj" (UID: "a78f3464-a81c-413a-a11a-ba6020b56874") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:03.528876 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:03.528846 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-f6dlw" Apr 16 14:01:03.594073 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:03.594048 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/0.log" Apr 16 14:01:03.594217 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:03.594090 2569 generic.go:358] "Generic (PLEG): container finished" podID="cb232208-c05b-4821-9c83-1582341d5232" containerID="1e67a6d83293e47feb095fd10bddf7fdaaa232559dbaaa90479a7216e9898508" exitCode=255 Apr 16 14:01:03.594217 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:03.594142 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" event={"ID":"cb232208-c05b-4821-9c83-1582341d5232","Type":"ContainerDied","Data":"1e67a6d83293e47feb095fd10bddf7fdaaa232559dbaaa90479a7216e9898508"} Apr 16 14:01:03.594377 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:03.594366 2569 scope.go:117] "RemoveContainer" containerID="1e67a6d83293e47feb095fd10bddf7fdaaa232559dbaaa90479a7216e9898508" Apr 16 14:01:04.598501 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:04.598476 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/1.log" Apr 16 14:01:04.598885 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:04.598812 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/0.log" Apr 16 14:01:04.598885 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:04.598844 2569 generic.go:358] "Generic (PLEG): container finished" podID="cb232208-c05b-4821-9c83-1582341d5232" containerID="52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468" exitCode=255 Apr 16 14:01:04.599001 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:04.598876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" event={"ID":"cb232208-c05b-4821-9c83-1582341d5232","Type":"ContainerDied","Data":"52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468"} Apr 16 14:01:04.599001 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:04.598931 2569 scope.go:117] "RemoveContainer" containerID="1e67a6d83293e47feb095fd10bddf7fdaaa232559dbaaa90479a7216e9898508" Apr 16 14:01:04.599152 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:04.599137 2569 scope.go:117] "RemoveContainer" containerID="52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468" Apr 16 14:01:04.599350 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:04.599329 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kpwpf_openshift-console-operator(cb232208-c05b-4821-9c83-1582341d5232)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podUID="cb232208-c05b-4821-9c83-1582341d5232" Apr 16 14:01:05.241536 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:05.241512 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p4h9n_ca33a748-eaae-40ea-9131-81e3f97ea69d/dns-node-resolver/0.log" Apr 16 14:01:05.408251 ip-10-0-142-16 kubenswrapper[2569]: 
I0416 14:01:05.408213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:05.408251 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:05.408254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:01:05.408484 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:05.408315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:05.408484 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:05.408381 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:13.408361708 +0000 UTC m=+110.748736540 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : configmap references non-existent config key: service-ca.crt Apr 16 14:01:05.408484 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:05.408450 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:05.408603 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:05.408504 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls podName:a78f3464-a81c-413a-a11a-ba6020b56874 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:13.408491238 +0000 UTC m=+110.748866072 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ht2bj" (UID: "a78f3464-a81c-413a-a11a-ba6020b56874") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:05.408603 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:05.408450 2569 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:01:05.408603 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:05.408535 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs podName:e5a40466-a66f-4e5a-b8ea-43dd46b22ac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:13.4085263 +0000 UTC m=+110.748901131 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs") pod "router-default-5c4787ff58-x4l8s" (UID: "e5a40466-a66f-4e5a-b8ea-43dd46b22ac1") : secret "router-metrics-certs-default" not found Apr 16 14:01:05.602369 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:05.602286 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/1.log" Apr 16 14:01:05.602753 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:05.602652 2569 scope.go:117] "RemoveContainer" containerID="52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468" Apr 16 14:01:05.602823 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:05.602807 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kpwpf_openshift-console-operator(cb232208-c05b-4821-9c83-1582341d5232)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podUID="cb232208-c05b-4821-9c83-1582341d5232" Apr 16 14:01:06.240181 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:06.240157 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xv4n6_d6588176-b995-4b14-80e6-c2ba40893912/node-ca/0.log" Apr 16 14:01:07.849893 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:07.849855 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:01:07.849893 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:07.849892 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:01:07.850454 ip-10-0-142-16 kubenswrapper[2569]: I0416 
14:01:07.850310 2569 scope.go:117] "RemoveContainer" containerID="52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468" Apr 16 14:01:07.850565 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:07.850544 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kpwpf_openshift-console-operator(cb232208-c05b-4821-9c83-1582341d5232)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podUID="cb232208-c05b-4821-9c83-1582341d5232" Apr 16 14:01:13.470522 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.470484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:13.470965 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.470538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:13.470965 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.470713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" Apr 16 14:01:13.470965 ip-10-0-142-16 
kubenswrapper[2569]: E0416 14:01:13.470784 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:13.470965 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:13.470829 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls podName:a78f3464-a81c-413a-a11a-ba6020b56874 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:29.470815932 +0000 UTC m=+126.811190763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-ht2bj" (UID: "a78f3464-a81c-413a-a11a-ba6020b56874") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:01:13.471326 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.471301 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-service-ca-bundle\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:13.472741 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.472722 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5a40466-a66f-4e5a-b8ea-43dd46b22ac1-metrics-certs\") pod \"router-default-5c4787ff58-x4l8s\" (UID: \"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1\") " pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:13.554415 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.554374 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:13.672549 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:13.672521 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5c4787ff58-x4l8s"] Apr 16 14:01:13.675866 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:13.675840 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a40466_a66f_4e5a_b8ea_43dd46b22ac1.slice/crio-423055c2503f29fc412a054d31cf908ec4f4764abee8ff7a93f7d9db4ca2f01d WatchSource:0}: Error finding container 423055c2503f29fc412a054d31cf908ec4f4764abee8ff7a93f7d9db4ca2f01d: Status 404 returned error can't find the container with id 423055c2503f29fc412a054d31cf908ec4f4764abee8ff7a93f7d9db4ca2f01d Apr 16 14:01:14.621376 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:14.621338 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" event={"ID":"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1","Type":"ContainerStarted","Data":"1c49bdee8989db4edd3bcb8a2af0f5e02cfded2e9678612646ec9e04b132a016"} Apr 16 14:01:14.621376 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:14.621375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" event={"ID":"e5a40466-a66f-4e5a-b8ea-43dd46b22ac1","Type":"ContainerStarted","Data":"423055c2503f29fc412a054d31cf908ec4f4764abee8ff7a93f7d9db4ca2f01d"} Apr 16 14:01:14.647461 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:14.647414 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" podStartSLOduration=17.647384064 podStartE2EDuration="17.647384064s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
14:01:14.646606064 +0000 UTC m=+111.986980920" watchObservedRunningTime="2026-04-16 14:01:14.647384064 +0000 UTC m=+111.987758916" Apr 16 14:01:15.554744 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:15.554700 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:15.557184 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:15.557162 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:15.623963 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:15.623933 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:15.625219 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:15.625201 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5c4787ff58-x4l8s" Apr 16 14:01:22.309529 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.309498 2569 scope.go:117] "RemoveContainer" containerID="52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468" Apr 16 14:01:22.643578 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.643497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log" Apr 16 14:01:22.643864 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.643849 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/1.log" Apr 16 14:01:22.643920 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.643880 2569 generic.go:358] "Generic (PLEG): container finished" podID="cb232208-c05b-4821-9c83-1582341d5232" containerID="c9dc9694c18d38f0027d2285e1acae441ba1076677a92cbde93c79457e44f188" exitCode=255 Apr 16 
14:01:22.643962 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.643932 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" event={"ID":"cb232208-c05b-4821-9c83-1582341d5232","Type":"ContainerDied","Data":"c9dc9694c18d38f0027d2285e1acae441ba1076677a92cbde93c79457e44f188"} Apr 16 14:01:22.643996 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.643967 2569 scope.go:117] "RemoveContainer" containerID="52ca39ac3c0c3cce9c358480d27dc3901dc5129ba2a0dcba9f738c9df4958468" Apr 16 14:01:22.644285 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:22.644269 2569 scope.go:117] "RemoveContainer" containerID="c9dc9694c18d38f0027d2285e1acae441ba1076677a92cbde93c79457e44f188" Apr 16 14:01:22.644493 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:22.644475 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kpwpf_openshift-console-operator(cb232208-c05b-4821-9c83-1582341d5232)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podUID="cb232208-c05b-4821-9c83-1582341d5232" Apr 16 14:01:23.648185 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.648156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log" Apr 16 14:01:23.708549 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.708518 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z"] Apr 16 14:01:23.712498 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.712483 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:23.714333 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.714315 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-kp7s2\"" Apr 16 14:01:23.714525 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.714505 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 14:01:23.715095 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.715081 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 14:01:23.724379 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.724347 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s5shw"] Apr 16 14:01:23.727413 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.727384 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z"] Apr 16 14:01:23.727548 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.727534 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.730082 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.730061 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:01:23.730193 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.730103 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-g94nx\"" Apr 16 14:01:23.730598 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.730571 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:01:23.730710 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.730694 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:01:23.730786 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.730772 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:01:23.740470 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.740447 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s5shw"] Apr 16 14:01:23.826406 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.826363 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d7569c8cd-4qw74"] Apr 16 14:01:23.829302 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.829280 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.831722 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.831681 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:01:23.831722 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.831708 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:01:23.832032 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.832013 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7shls\"" Apr 16 14:01:23.832135 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.832070 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:01:23.840310 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.840289 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:01:23.848373 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.848349 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d7569c8cd-4qw74"] Apr 16 14:01:23.855502 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855478 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1a771ca7-2942-4693-8a9f-243a8e6f82d5-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-c4d9z\" (UID: \"1a771ca7-2942-4693-8a9f-243a8e6f82d5\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:23.855622 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9033afb5-4784-4b50-813c-d22961325cf4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.855622 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855529 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfzf4\" (UniqueName: \"kubernetes.io/projected/9033afb5-4784-4b50-813c-d22961325cf4-kube-api-access-zfzf4\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.855622 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855599 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1a771ca7-2942-4693-8a9f-243a8e6f82d5-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-c4d9z\" (UID: \"1a771ca7-2942-4693-8a9f-243a8e6f82d5\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:23.855744 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855650 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9033afb5-4784-4b50-813c-d22961325cf4-crio-socket\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.855744 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9033afb5-4784-4b50-813c-d22961325cf4-data-volume\") pod 
\"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.855744 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.855717 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9033afb5-4784-4b50-813c-d22961325cf4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.956587 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lgpg\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-kube-api-access-8lgpg\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.956587 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870e7376-e0fa-40ca-ad2c-98fa6189639f-trusted-ca\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.956792 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1a771ca7-2942-4693-8a9f-243a8e6f82d5-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-c4d9z\" (UID: \"1a771ca7-2942-4693-8a9f-243a8e6f82d5\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" 
Apr 16 14:01:23.956792 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-registry-tls\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.956792 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956775 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/870e7376-e0fa-40ca-ad2c-98fa6189639f-installation-pull-secrets\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.956880 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956826 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-bound-sa-token\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.956880 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1a771ca7-2942-4693-8a9f-243a8e6f82d5-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-c4d9z\" (UID: \"1a771ca7-2942-4693-8a9f-243a8e6f82d5\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:23.956939 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfzf4\" (UniqueName: 
\"kubernetes.io/projected/9033afb5-4784-4b50-813c-d22961325cf4-kube-api-access-zfzf4\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.956984 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.956968 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/870e7376-e0fa-40ca-ad2c-98fa6189639f-image-registry-private-configuration\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.957038 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/870e7376-e0fa-40ca-ad2c-98fa6189639f-ca-trust-extracted\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.957203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/870e7376-e0fa-40ca-ad2c-98fa6189639f-registry-certificates\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:23.957203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9033afb5-4784-4b50-813c-d22961325cf4-crio-socket\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") 
" pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.957203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9033afb5-4784-4b50-813c-d22961325cf4-data-volume\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.957203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957158 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9033afb5-4784-4b50-813c-d22961325cf4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.957203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957175 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9033afb5-4784-4b50-813c-d22961325cf4-crio-socket\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.957203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957189 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9033afb5-4784-4b50-813c-d22961325cf4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.957548 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/9033afb5-4784-4b50-813c-d22961325cf4-data-volume\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.957602 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1a771ca7-2942-4693-8a9f-243a8e6f82d5-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-c4d9z\" (UID: \"1a771ca7-2942-4693-8a9f-243a8e6f82d5\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:23.957759 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.957740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9033afb5-4784-4b50-813c-d22961325cf4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.959269 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.959248 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1a771ca7-2942-4693-8a9f-243a8e6f82d5-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-c4d9z\" (UID: \"1a771ca7-2942-4693-8a9f-243a8e6f82d5\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:23.959460 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.959444 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9033afb5-4784-4b50-813c-d22961325cf4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " 
pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:23.965743 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:23.965709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfzf4\" (UniqueName: \"kubernetes.io/projected/9033afb5-4784-4b50-813c-d22961325cf4-kube-api-access-zfzf4\") pod \"insights-runtime-extractor-s5shw\" (UID: \"9033afb5-4784-4b50-813c-d22961325cf4\") " pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:24.021344 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.021302 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" Apr 16 14:01:24.036064 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.036036 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s5shw" Apr 16 14:01:24.057747 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lgpg\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-kube-api-access-8lgpg\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.057947 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870e7376-e0fa-40ca-ad2c-98fa6189639f-trusted-ca\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.057947 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-registry-tls\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.057947 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/870e7376-e0fa-40ca-ad2c-98fa6189639f-installation-pull-secrets\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.057947 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057890 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-bound-sa-token\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.057947 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/870e7376-e0fa-40ca-ad2c-98fa6189639f-image-registry-private-configuration\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.058191 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/870e7376-e0fa-40ca-ad2c-98fa6189639f-ca-trust-extracted\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " 
pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.058191 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.057995 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/870e7376-e0fa-40ca-ad2c-98fa6189639f-registry-certificates\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.059426 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.059007 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/870e7376-e0fa-40ca-ad2c-98fa6189639f-registry-certificates\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.059650 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.059623 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/870e7376-e0fa-40ca-ad2c-98fa6189639f-ca-trust-extracted\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.060097 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.060075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/870e7376-e0fa-40ca-ad2c-98fa6189639f-trusted-ca\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.061662 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.061639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/870e7376-e0fa-40ca-ad2c-98fa6189639f-installation-pull-secrets\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.061737 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.061663 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-registry-tls\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.062068 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.062048 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/870e7376-e0fa-40ca-ad2c-98fa6189639f-image-registry-private-configuration\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.071796 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.071771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-bound-sa-token\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.072599 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.072564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lgpg\" (UniqueName: \"kubernetes.io/projected/870e7376-e0fa-40ca-ad2c-98fa6189639f-kube-api-access-8lgpg\") pod \"image-registry-6d7569c8cd-4qw74\" (UID: \"870e7376-e0fa-40ca-ad2c-98fa6189639f\") " pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:01:24.138901 
ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.138867 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74"
Apr 16 14:01:24.174734 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.174677 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z"]
Apr 16 14:01:24.176722 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:24.176687 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a771ca7_2942_4693_8a9f_243a8e6f82d5.slice/crio-ced8751eea48c100426f60c1537ef3de1691e089d1c262a3505e3487f14f8448 WatchSource:0}: Error finding container ced8751eea48c100426f60c1537ef3de1691e089d1c262a3505e3487f14f8448: Status 404 returned error can't find the container with id ced8751eea48c100426f60c1537ef3de1691e089d1c262a3505e3487f14f8448
Apr 16 14:01:24.198870 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.198658 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s5shw"]
Apr 16 14:01:24.202404 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:24.202357 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9033afb5_4784_4b50_813c_d22961325cf4.slice/crio-b8d8c928bb70594267a57a1a7e71ce132cec05be4d2b71868191243cb166bab5 WatchSource:0}: Error finding container b8d8c928bb70594267a57a1a7e71ce132cec05be4d2b71868191243cb166bab5: Status 404 returned error can't find the container with id b8d8c928bb70594267a57a1a7e71ce132cec05be4d2b71868191243cb166bab5
Apr 16 14:01:24.276355 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.276332 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d7569c8cd-4qw74"]
Apr 16 14:01:24.279473 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:24.279444 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod870e7376_e0fa_40ca_ad2c_98fa6189639f.slice/crio-989f9c3a8e6fc4ede7e34c8e551be27b41b8ec8dbc678bc3e60210ff577ec148 WatchSource:0}: Error finding container 989f9c3a8e6fc4ede7e34c8e551be27b41b8ec8dbc678bc3e60210ff577ec148: Status 404 returned error can't find the container with id 989f9c3a8e6fc4ede7e34c8e551be27b41b8ec8dbc678bc3e60210ff577ec148
Apr 16 14:01:24.653123 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.653029 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5shw" event={"ID":"9033afb5-4784-4b50-813c-d22961325cf4","Type":"ContainerStarted","Data":"c4a205191ca0e9f372577a7ee06e6f99e43e5de31e7c841f558751702585d624"}
Apr 16 14:01:24.653123 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.653081 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5shw" event={"ID":"9033afb5-4784-4b50-813c-d22961325cf4","Type":"ContainerStarted","Data":"b8d8c928bb70594267a57a1a7e71ce132cec05be4d2b71868191243cb166bab5"}
Apr 16 14:01:24.654634 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.654603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" event={"ID":"870e7376-e0fa-40ca-ad2c-98fa6189639f","Type":"ContainerStarted","Data":"b690b8f4cc4901a36064576f9e423dae22ebb69ef4c2dfd30acd1c548d795d09"}
Apr 16 14:01:24.654780 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.654640 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" event={"ID":"870e7376-e0fa-40ca-ad2c-98fa6189639f","Type":"ContainerStarted","Data":"989f9c3a8e6fc4ede7e34c8e551be27b41b8ec8dbc678bc3e60210ff577ec148"}
Apr 16 14:01:24.654780 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.654740 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74"
Apr 16 14:01:24.655940 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.655919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" event={"ID":"1a771ca7-2942-4693-8a9f-243a8e6f82d5","Type":"ContainerStarted","Data":"ced8751eea48c100426f60c1537ef3de1691e089d1c262a3505e3487f14f8448"}
Apr 16 14:01:24.674029 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:24.673977 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podStartSLOduration=1.6739626699999999 podStartE2EDuration="1.67396267s" podCreationTimestamp="2026-04-16 14:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:24.673339593 +0000 UTC m=+122.013714461" watchObservedRunningTime="2026-04-16 14:01:24.67396267 +0000 UTC m=+122.014337501"
Apr 16 14:01:25.660833 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:25.660746 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5shw" event={"ID":"9033afb5-4784-4b50-813c-d22961325cf4","Type":"ContainerStarted","Data":"01585ed194c8316617647dbdce900762cf31b9c2b09e27f3746ccde694228510"}
Apr 16 14:01:25.661963 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:25.661938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" event={"ID":"1a771ca7-2942-4693-8a9f-243a8e6f82d5","Type":"ContainerStarted","Data":"66d54562255f585b91a21c0c3ef45f64113d142eb4ab8c6499afa2b14669f620"}
Apr 16 14:01:25.677548 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:25.677500 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-c4d9z" podStartSLOduration=1.59408658 podStartE2EDuration="2.677486619s" podCreationTimestamp="2026-04-16 14:01:23 +0000 UTC" firstStartedPulling="2026-04-16 14:01:24.178915377 +0000 UTC m=+121.519290209" lastFinishedPulling="2026-04-16 14:01:25.262315418 +0000 UTC m=+122.602690248" observedRunningTime="2026-04-16 14:01:25.677188961 +0000 UTC m=+123.017563815" watchObservedRunningTime="2026-04-16 14:01:25.677486619 +0000 UTC m=+123.017861471"
Apr 16 14:01:27.850146 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:27.850103 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf"
Apr 16 14:01:27.850146 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:27.850152 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf"
Apr 16 14:01:27.850622 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:27.850568 2569 scope.go:117] "RemoveContainer" containerID="c9dc9694c18d38f0027d2285e1acae441ba1076677a92cbde93c79457e44f188"
Apr 16 14:01:27.850795 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:27.850776 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kpwpf_openshift-console-operator(cb232208-c05b-4821-9c83-1582341d5232)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podUID="cb232208-c05b-4821-9c83-1582341d5232"
Apr 16 14:01:28.671358 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:28.671317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5shw" event={"ID":"9033afb5-4784-4b50-813c-d22961325cf4","Type":"ContainerStarted","Data":"b47101c8b6b36767d439ca1693f832576ee2c238d93ad838d68c1ddf6e253d39"}
Apr 16 14:01:28.691888 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:28.691828 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s5shw" podStartSLOduration=1.827593024 podStartE2EDuration="5.691812455s" podCreationTimestamp="2026-04-16 14:01:23 +0000 UTC" firstStartedPulling="2026-04-16 14:01:24.273358489 +0000 UTC m=+121.613733324" lastFinishedPulling="2026-04-16 14:01:28.137577923 +0000 UTC m=+125.477952755" observedRunningTime="2026-04-16 14:01:28.690575917 +0000 UTC m=+126.030950770" watchObservedRunningTime="2026-04-16 14:01:28.691812455 +0000 UTC m=+126.032187309"
Apr 16 14:01:29.505768 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:29.505710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"
Apr 16 14:01:29.508104 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:29.508084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a78f3464-a81c-413a-a11a-ba6020b56874-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-ht2bj\" (UID: \"a78f3464-a81c-413a-a11a-ba6020b56874\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"
Apr 16 14:01:29.751060 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:29.751026 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4lwtk\""
Apr 16 14:01:29.759964 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:29.759909 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"
Apr 16 14:01:29.878582 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:29.878553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj"]
Apr 16 14:01:29.881500 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:29.881469 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda78f3464_a81c_413a_a11a_ba6020b56874.slice/crio-4de34046c47c509c2bc83ea50f5983442a691bdc3c50446057b476ee662e7240 WatchSource:0}: Error finding container 4de34046c47c509c2bc83ea50f5983442a691bdc3c50446057b476ee662e7240: Status 404 returned error can't find the container with id 4de34046c47c509c2bc83ea50f5983442a691bdc3c50446057b476ee662e7240
Apr 16 14:01:30.677489 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:30.677452 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" event={"ID":"a78f3464-a81c-413a-a11a-ba6020b56874","Type":"ContainerStarted","Data":"4de34046c47c509c2bc83ea50f5983442a691bdc3c50446057b476ee662e7240"}
Apr 16 14:01:32.683848 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:32.683812 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" event={"ID":"a78f3464-a81c-413a-a11a-ba6020b56874","Type":"ContainerStarted","Data":"7e07467bac9b66853b4812c16be45db6fcfc28613f6bab4c1f172d4650df5695"}
Apr 16 14:01:32.700277 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:32.700219 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-ht2bj" podStartSLOduration=33.967573665 podStartE2EDuration="35.700200598s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:01:29.883328628 +0000 UTC m=+127.223703459" lastFinishedPulling="2026-04-16 14:01:31.615955556 +0000 UTC m=+128.956330392" observedRunningTime="2026-04-16 14:01:32.699343174 +0000 UTC m=+130.039718027" watchObservedRunningTime="2026-04-16 14:01:32.700200598 +0000 UTC m=+130.040575452"
Apr 16 14:01:33.136381 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:33.136290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 14:01:33.138593 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:33.138567 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44d7f301-04c1-422a-a689-9d0e4f02952c-metrics-certs\") pod \"network-metrics-daemon-lfj5m\" (UID: \"44d7f301-04c1-422a-a689-9d0e4f02952c\") " pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 14:01:33.322893 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:33.322863 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x5z8x\""
Apr 16 14:01:33.330889 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:33.330862 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lfj5m"
Apr 16 14:01:33.451028 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:33.450999 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lfj5m"]
Apr 16 14:01:33.454054 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:33.454023 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44d7f301_04c1_422a_a689_9d0e4f02952c.slice/crio-dab2dd6fa3f4474e540120bf6ece1565c24d5c606029388da313928a329dc141 WatchSource:0}: Error finding container dab2dd6fa3f4474e540120bf6ece1565c24d5c606029388da313928a329dc141: Status 404 returned error can't find the container with id dab2dd6fa3f4474e540120bf6ece1565c24d5c606029388da313928a329dc141
Apr 16 14:01:33.687700 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:33.687666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lfj5m" event={"ID":"44d7f301-04c1-422a-a689-9d0e4f02952c","Type":"ContainerStarted","Data":"dab2dd6fa3f4474e540120bf6ece1565c24d5c606029388da313928a329dc141"}
Apr 16 14:01:36.700682 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:36.700644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lfj5m" event={"ID":"44d7f301-04c1-422a-a689-9d0e4f02952c","Type":"ContainerStarted","Data":"7c2295b6c84be7b21146f5bfa1ab607a81205520842a92e6b1e8b08c632d8ea7"}
Apr 16 14:01:36.701038 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:36.700688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lfj5m" event={"ID":"44d7f301-04c1-422a-a689-9d0e4f02952c","Type":"ContainerStarted","Data":"3e630e12ebc357edacaef0f19feba93e0f0f4da4cb5037f36a8c39d84b1d9761"}
Apr 16 14:01:36.722234 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:36.722181 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lfj5m" podStartSLOduration=131.156664727 podStartE2EDuration="2m13.722162828s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 14:01:33.455919622 +0000 UTC m=+130.796294453" lastFinishedPulling="2026-04-16 14:01:36.021417716 +0000 UTC m=+133.361792554" observedRunningTime="2026-04-16 14:01:36.720624756 +0000 UTC m=+134.060999609" watchObservedRunningTime="2026-04-16 14:01:36.722162828 +0000 UTC m=+134.062537683"
Apr 16 14:01:40.309107 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.309080 2569 scope.go:117] "RemoveContainer" containerID="c9dc9694c18d38f0027d2285e1acae441ba1076677a92cbde93c79457e44f188"
Apr 16 14:01:40.309486 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:40.309232 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kpwpf_openshift-console-operator(cb232208-c05b-4821-9c83-1582341d5232)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podUID="cb232208-c05b-4821-9c83-1582341d5232"
Apr 16 14:01:40.615748 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.615665 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-qbddt"]
Apr 16 14:01:40.618013 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.617994 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.620508 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.620480 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 14:01:40.620508 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.620497 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:01:40.620658 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.620512 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:01:40.620658 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.620535 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-5zldq\""
Apr 16 14:01:40.620658 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.620609 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 14:01:40.634626 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.634605 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-qbddt"]
Apr 16 14:01:40.635506 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.635487 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5866k"]
Apr 16 14:01:40.637373 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.637357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.639237 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.639219 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:01:40.639492 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.639472 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:01:40.639589 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.639489 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wstcq\""
Apr 16 14:01:40.639589 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.639472 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:01:40.797968 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.797931 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.797968 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.797973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-sys\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.797997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.798203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-textfile\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-tls\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798193 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-wtmp\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798211 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798231 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29aa9554-40e1-4efd-b0bc-34ab8445a858-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-root\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798271 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-accelerators-collector-config\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798288 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4shq\" (UniqueName: \"kubernetes.io/projected/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-api-access-p4shq\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/29aa9554-40e1-4efd-b0bc-34ab8445a858-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57nn4\" (UniqueName: \"kubernetes.io/projected/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-kube-api-access-57nn4\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.798450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.798380 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-metrics-client-ca\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.898790 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.898704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57nn4\" (UniqueName: \"kubernetes.io/projected/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-kube-api-access-57nn4\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.898790 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.898762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-metrics-client-ca\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.898938 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.898986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-sys\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899013 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-textfile\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-sys\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-tls\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899282 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-wtmp\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29aa9554-40e1-4efd-b0bc-34ab8445a858-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899373 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-textfile\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-root\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899428 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-metrics-client-ca\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-accelerators-collector-config\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4shq\" (UniqueName: \"kubernetes.io/projected/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-api-access-p4shq\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.900420 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899491 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-wtmp\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/29aa9554-40e1-4efd-b0bc-34ab8445a858-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:40.899574 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:40.899647 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-tls podName:0aabbbc1-d84a-4c1a-ae38-92ce1b31b836 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:41.399624372 +0000 UTC m=+138.739999202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-tls") pod "node-exporter-5866k" (UID: "0aabbbc1-d84a-4c1a-ae38-92ce1b31b836") : secret "node-exporter-tls" not found
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.899911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/29aa9554-40e1-4efd-b0bc-34ab8445a858-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.900190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-root\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.900211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.900292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29aa9554-40e1-4efd-b0bc-34ab8445a858-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.901815 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.900457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-accelerators-collector-config\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.902213 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.902004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.902213 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.902020 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.902853 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.902834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.922250 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.922226 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57nn4\" (UniqueName: \"kubernetes.io/projected/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-kube-api-access-57nn4\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k"
Apr 16 14:01:40.922945 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.922925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4shq\" (UniqueName: \"kubernetes.io/projected/29aa9554-40e1-4efd-b0bc-34ab8445a858-kube-api-access-p4shq\") pod \"kube-state-metrics-7479c89684-qbddt\" (UID: \"29aa9554-40e1-4efd-b0bc-34ab8445a858\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt"
Apr 16 14:01:40.926562 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:40.926547 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt" Apr 16 14:01:41.061041 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:41.061008 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-qbddt"] Apr 16 14:01:41.064332 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:41.064305 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29aa9554_40e1_4efd_b0bc_34ab8445a858.slice/crio-32b76a84803e0f6032696d41919307c1d7a9aca8d6f94ab8f1e75f7dfdeed98b WatchSource:0}: Error finding container 32b76a84803e0f6032696d41919307c1d7a9aca8d6f94ab8f1e75f7dfdeed98b: Status 404 returned error can't find the container with id 32b76a84803e0f6032696d41919307c1d7a9aca8d6f94ab8f1e75f7dfdeed98b Apr 16 14:01:41.405414 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:41.405368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-tls\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k" Apr 16 14:01:41.407587 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:41.407569 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0aabbbc1-d84a-4c1a-ae38-92ce1b31b836-node-exporter-tls\") pod \"node-exporter-5866k\" (UID: \"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836\") " pod="openshift-monitoring/node-exporter-5866k" Apr 16 14:01:41.549932 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:41.549902 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5866k" Apr 16 14:01:41.557800 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:41.557765 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aabbbc1_d84a_4c1a_ae38_92ce1b31b836.slice/crio-cfd982b75f788db593e733eeabf1fa43adeaed8122dccaa6f112806974df6579 WatchSource:0}: Error finding container cfd982b75f788db593e733eeabf1fa43adeaed8122dccaa6f112806974df6579: Status 404 returned error can't find the container with id cfd982b75f788db593e733eeabf1fa43adeaed8122dccaa6f112806974df6579 Apr 16 14:01:41.713718 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:41.713678 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5866k" event={"ID":"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836","Type":"ContainerStarted","Data":"cfd982b75f788db593e733eeabf1fa43adeaed8122dccaa6f112806974df6579"} Apr 16 14:01:41.714633 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:41.714614 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt" event={"ID":"29aa9554-40e1-4efd-b0bc-34ab8445a858","Type":"ContainerStarted","Data":"32b76a84803e0f6032696d41919307c1d7a9aca8d6f94ab8f1e75f7dfdeed98b"} Apr 16 14:01:43.725592 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:43.725546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt" event={"ID":"29aa9554-40e1-4efd-b0bc-34ab8445a858","Type":"ContainerStarted","Data":"d406eb4fd5490890e11d7f3dc706caeddb698dc814848a7342b19c8611df9ebb"} Apr 16 14:01:43.725592 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:43.725590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt" 
event={"ID":"29aa9554-40e1-4efd-b0bc-34ab8445a858","Type":"ContainerStarted","Data":"e64815c214431a861da55e32234adc725db12fb7756666164db23bceb65fa67f"} Apr 16 14:01:43.726072 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:43.725603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt" event={"ID":"29aa9554-40e1-4efd-b0bc-34ab8445a858","Type":"ContainerStarted","Data":"8b657b3f3e64160e79c6b4d156959420a9c9c8ac30277cfa6fe47c2d46f449e2"} Apr 16 14:01:43.750957 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:43.750899 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-qbddt" podStartSLOduration=2.131082536 podStartE2EDuration="3.750881885s" podCreationTimestamp="2026-04-16 14:01:40 +0000 UTC" firstStartedPulling="2026-04-16 14:01:41.066116573 +0000 UTC m=+138.406491404" lastFinishedPulling="2026-04-16 14:01:42.685915902 +0000 UTC m=+140.026290753" observedRunningTime="2026-04-16 14:01:43.749619088 +0000 UTC m=+141.089993965" watchObservedRunningTime="2026-04-16 14:01:43.750881885 +0000 UTC m=+141.091256739" Apr 16 14:01:44.143427 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:44.143377 2569 patch_prober.go:28] interesting pod/image-registry-6d7569c8cd-4qw74 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:01:44.143573 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:44.143459 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:44.729113 ip-10-0-142-16 kubenswrapper[2569]: I0416 
14:01:44.729083 2569 generic.go:358] "Generic (PLEG): container finished" podID="0aabbbc1-d84a-4c1a-ae38-92ce1b31b836" containerID="81b76548a42468fda0d66de3d5adba4de3964013882b6cba17282c88ae6ea4bd" exitCode=0 Apr 16 14:01:44.729525 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:44.729173 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5866k" event={"ID":"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836","Type":"ContainerDied","Data":"81b76548a42468fda0d66de3d5adba4de3964013882b6cba17282c88ae6ea4bd"} Apr 16 14:01:45.665795 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:45.665759 2569 patch_prober.go:28] interesting pod/image-registry-6d7569c8cd-4qw74 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:01:45.666088 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:45.665816 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:45.733761 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:45.733724 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5866k" event={"ID":"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836","Type":"ContainerStarted","Data":"58949f7d5c45be53a4cddfdf0812fbc6556854eb9432c7ea8e19cba67cedd94f"} Apr 16 14:01:45.733761 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:45.733759 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5866k" event={"ID":"0aabbbc1-d84a-4c1a-ae38-92ce1b31b836","Type":"ContainerStarted","Data":"87bfe6502b8387679516bc5b6136c2a8a70ff46bd71287b6e9a55c0cc7bb7056"} Apr 16 
14:01:45.756939 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:45.756882 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5866k" podStartSLOduration=3.317803098 podStartE2EDuration="5.756866881s" podCreationTimestamp="2026-04-16 14:01:40 +0000 UTC" firstStartedPulling="2026-04-16 14:01:41.559442067 +0000 UTC m=+138.899816897" lastFinishedPulling="2026-04-16 14:01:43.998505849 +0000 UTC m=+141.338880680" observedRunningTime="2026-04-16 14:01:45.755789894 +0000 UTC m=+143.096164746" watchObservedRunningTime="2026-04-16 14:01:45.756866881 +0000 UTC m=+143.097241733" Apr 16 14:01:54.143605 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:54.143567 2569 patch_prober.go:28] interesting pod/image-registry-6d7569c8cd-4qw74 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:01:54.144085 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:54.143660 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:55.308752 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.308725 2569 scope.go:117] "RemoveContainer" containerID="c9dc9694c18d38f0027d2285e1acae441ba1076677a92cbde93c79457e44f188" Apr 16 14:01:55.665898 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.665806 2569 patch_prober.go:28] interesting pod/image-registry-6d7569c8cd-4qw74 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: 
please see /debug/health"}]} Apr 16 14:01:55.665898 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.665851 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:55.759032 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.759005 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log" Apr 16 14:01:55.759172 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.759053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" event={"ID":"cb232208-c05b-4821-9c83-1582341d5232","Type":"ContainerStarted","Data":"b1c57c0259fd5601145d3852b4b04ed602e4e6d79f06b1dd65591c1440eab091"} Apr 16 14:01:55.759308 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.759285 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:01:55.777612 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:55.777570 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" podStartSLOduration=54.049640847 podStartE2EDuration="58.777558709s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:00:57.989540676 +0000 UTC m=+95.329915506" lastFinishedPulling="2026-04-16 14:01:02.717458526 +0000 UTC m=+100.057833368" observedRunningTime="2026-04-16 14:01:55.776296968 +0000 UTC m=+153.116671821" watchObservedRunningTime="2026-04-16 14:01:55.777558709 +0000 UTC m=+153.117933596" Apr 16 14:01:56.017961 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.017933 2569 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kpwpf" Apr 16 14:01:56.216770 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.216733 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-wqjg7"] Apr 16 14:01:56.218605 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.218586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-wqjg7" Apr 16 14:01:56.222203 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.222183 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:01:56.222974 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.222956 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:01:56.223076 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.223069 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-jjj6p\"" Apr 16 14:01:56.240428 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.240391 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-wqjg7"] Apr 16 14:01:56.317964 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.317893 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cft4w\" (UniqueName: \"kubernetes.io/projected/06c957ef-10ea-4050-a9a1-35994a3e35f8-kube-api-access-cft4w\") pod \"downloads-586b57c7b4-wqjg7\" (UID: \"06c957ef-10ea-4050-a9a1-35994a3e35f8\") " pod="openshift-console/downloads-586b57c7b4-wqjg7" Apr 16 14:01:56.419088 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.419057 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cft4w\" (UniqueName: 
\"kubernetes.io/projected/06c957ef-10ea-4050-a9a1-35994a3e35f8-kube-api-access-cft4w\") pod \"downloads-586b57c7b4-wqjg7\" (UID: \"06c957ef-10ea-4050-a9a1-35994a3e35f8\") " pod="openshift-console/downloads-586b57c7b4-wqjg7" Apr 16 14:01:56.428022 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.427992 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cft4w\" (UniqueName: \"kubernetes.io/projected/06c957ef-10ea-4050-a9a1-35994a3e35f8-kube-api-access-cft4w\") pod \"downloads-586b57c7b4-wqjg7\" (UID: \"06c957ef-10ea-4050-a9a1-35994a3e35f8\") " pod="openshift-console/downloads-586b57c7b4-wqjg7" Apr 16 14:01:56.527382 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.527351 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-wqjg7" Apr 16 14:01:56.651581 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.651551 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-wqjg7"] Apr 16 14:01:56.654234 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:01:56.654191 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06c957ef_10ea_4050_a9a1_35994a3e35f8.slice/crio-6474eda489061d64d2a40f73bfb5d3fd782a745da451d09b9d486af439a0fbab WatchSource:0}: Error finding container 6474eda489061d64d2a40f73bfb5d3fd782a745da451d09b9d486af439a0fbab: Status 404 returned error can't find the container with id 6474eda489061d64d2a40f73bfb5d3fd782a745da451d09b9d486af439a0fbab Apr 16 14:01:56.762688 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:56.762659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-wqjg7" event={"ID":"06c957ef-10ea-4050-a9a1-35994a3e35f8","Type":"ContainerStarted","Data":"6474eda489061d64d2a40f73bfb5d3fd782a745da451d09b9d486af439a0fbab"} Apr 16 14:01:58.505168 ip-10-0-142-16 kubenswrapper[2569]: E0416 
14:01:58.505122 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p6gtp" podUID="59e98d1e-f9cf-4faa-bd64-a597149d3bc7" Apr 16 14:01:58.513364 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:01:58.513229 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b4d9q" podUID="852d4d38-4926-4c6a-a9ad-11a60019138a" Apr 16 14:01:58.768781 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:58.768655 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4d9q" Apr 16 14:01:58.768781 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:01:58.768662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6gtp" Apr 16 14:02:03.479741 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.479698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q" Apr 16 14:02:03.480292 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.479745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp" Apr 16 14:02:03.482265 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.482236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/59e98d1e-f9cf-4faa-bd64-a597149d3bc7-metrics-tls\") pod \"dns-default-p6gtp\" (UID: \"59e98d1e-f9cf-4faa-bd64-a597149d3bc7\") " pod="openshift-dns/dns-default-p6gtp" Apr 16 14:02:03.483074 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.483047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/852d4d38-4926-4c6a-a9ad-11a60019138a-cert\") pod \"ingress-canary-b4d9q\" (UID: \"852d4d38-4926-4c6a-a9ad-11a60019138a\") " pod="openshift-ingress-canary/ingress-canary-b4d9q" Apr 16 14:02:03.571451 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.571424 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbpqw\"" Apr 16 14:02:03.571595 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.571464 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tpzlf\"" Apr 16 14:02:03.580118 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.580087 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6gtp" Apr 16 14:02:03.580252 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.580089 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b4d9q" Apr 16 14:02:03.723916 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.723817 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p6gtp"] Apr 16 14:02:03.726307 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:02:03.726275 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e98d1e_f9cf_4faa_bd64_a597149d3bc7.slice/crio-8c311476d30e431ce9d67907644a5df017d2d852a4af66851c9f865b12a53d64 WatchSource:0}: Error finding container 8c311476d30e431ce9d67907644a5df017d2d852a4af66851c9f865b12a53d64: Status 404 returned error can't find the container with id 8c311476d30e431ce9d67907644a5df017d2d852a4af66851c9f865b12a53d64 Apr 16 14:02:03.742089 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.742030 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b4d9q"] Apr 16 14:02:03.746355 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:02:03.746322 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod852d4d38_4926_4c6a_a9ad_11a60019138a.slice/crio-a813290ca0bf35607a10cc9439d24965cfb88481ce33cd671fdf917fdbb2bdce WatchSource:0}: Error finding container a813290ca0bf35607a10cc9439d24965cfb88481ce33cd671fdf917fdbb2bdce: Status 404 returned error can't find the container with id a813290ca0bf35607a10cc9439d24965cfb88481ce33cd671fdf917fdbb2bdce Apr 16 14:02:03.782789 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.782754 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6gtp" event={"ID":"59e98d1e-f9cf-4faa-bd64-a597149d3bc7","Type":"ContainerStarted","Data":"8c311476d30e431ce9d67907644a5df017d2d852a4af66851c9f865b12a53d64"} Apr 16 14:02:03.783859 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:03.783835 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b4d9q" event={"ID":"852d4d38-4926-4c6a-a9ad-11a60019138a","Type":"ContainerStarted","Data":"a813290ca0bf35607a10cc9439d24965cfb88481ce33cd671fdf917fdbb2bdce"} Apr 16 14:02:04.144105 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:04.144027 2569 patch_prober.go:28] interesting pod/image-registry-6d7569c8cd-4qw74 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:04.144105 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:04.144095 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:04.144300 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:04.144141 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" Apr 16 14:02:04.144723 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:04.144675 2569 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"b690b8f4cc4901a36064576f9e423dae22ebb69ef4c2dfd30acd1c548d795d09"} pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" containerMessage="Container registry failed liveness probe, will be restarted" Apr 16 14:02:04.149897 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:04.149849 2569 patch_prober.go:28] interesting pod/image-registry-6d7569c8cd-4qw74 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service 
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:02:04.150037 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:04.149905 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:02:05.792711 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:05.792619 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6gtp" event={"ID":"59e98d1e-f9cf-4faa-bd64-a597149d3bc7","Type":"ContainerStarted","Data":"bb0cb22164789865b0dd947a042d665d5b6117f827c4891967284de3b1ee3f97"} Apr 16 14:02:05.792711 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:05.792662 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6gtp" event={"ID":"59e98d1e-f9cf-4faa-bd64-a597149d3bc7","Type":"ContainerStarted","Data":"ce0424d00fdebe2baf8d5b4217b1a18bb0e6caf8777075aef728c6ed58f82d6f"} Apr 16 14:02:05.793304 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:05.792776 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p6gtp" Apr 16 14:02:05.810703 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:05.810651 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p6gtp" podStartSLOduration=129.156467833 podStartE2EDuration="2m10.810634412s" podCreationTimestamp="2026-04-16 13:59:55 +0000 UTC" firstStartedPulling="2026-04-16 14:02:03.729266017 +0000 UTC m=+161.069640855" lastFinishedPulling="2026-04-16 14:02:05.383432588 +0000 UTC m=+162.723807434" observedRunningTime="2026-04-16 14:02:05.809500592 +0000 UTC m=+163.149875457" watchObservedRunningTime="2026-04-16 14:02:05.810634412 +0000 UTC m=+163.151009268" Apr 16 14:02:06.796623 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:06.796584 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b4d9q" event={"ID":"852d4d38-4926-4c6a-a9ad-11a60019138a","Type":"ContainerStarted","Data":"81afffe4d3d3c85705a764482b48f033f795e062108f1175d508835d50e0f239"} Apr 16 14:02:06.814141 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:06.814067 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b4d9q" podStartSLOduration=129.244491401 podStartE2EDuration="2m11.814049736s" podCreationTimestamp="2026-04-16 13:59:55 +0000 UTC" firstStartedPulling="2026-04-16 14:02:03.748690006 +0000 UTC m=+161.089064839" lastFinishedPulling="2026-04-16 14:02:06.31824833 +0000 UTC m=+163.658623174" observedRunningTime="2026-04-16 14:02:06.813794012 +0000 UTC m=+164.154168870" watchObservedRunningTime="2026-04-16 14:02:06.814049736 +0000 UTC m=+164.154424591" Apr 16 14:02:13.817986 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.817945 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-wqjg7" event={"ID":"06c957ef-10ea-4050-a9a1-35994a3e35f8","Type":"ContainerStarted","Data":"2699be3220b4caa218631b8302a4b95e847d0ff75d6b9d4009b3ce0e2b6ffc97"} Apr 16 14:02:13.818524 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.818158 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-wqjg7" Apr 16 14:02:13.819480 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.819448 2569 generic.go:358] "Generic (PLEG): container finished" podID="5f8c6a6f-0f8c-4610-a6c5-f33111156650" containerID="9d42854938c9f26bfb7f9938aa28c6d8393aa93bfc1b06427503810e1dc9879a" exitCode=0 Apr 16 14:02:13.819617 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.819512 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" 
event={"ID":"5f8c6a6f-0f8c-4610-a6c5-f33111156650","Type":"ContainerDied","Data":"9d42854938c9f26bfb7f9938aa28c6d8393aa93bfc1b06427503810e1dc9879a"}
Apr 16 14:02:13.819953 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.819834 2569 scope.go:117] "RemoveContainer" containerID="9d42854938c9f26bfb7f9938aa28c6d8393aa93bfc1b06427503810e1dc9879a"
Apr 16 14:02:13.834226 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.834204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-wqjg7"
Apr 16 14:02:13.844928 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:13.844887 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-wqjg7" podStartSLOduration=1.357484802 podStartE2EDuration="17.844873908s" podCreationTimestamp="2026-04-16 14:01:56 +0000 UTC" firstStartedPulling="2026-04-16 14:01:56.656048332 +0000 UTC m=+153.996423162" lastFinishedPulling="2026-04-16 14:02:13.143437425 +0000 UTC m=+170.483812268" observedRunningTime="2026-04-16 14:02:13.843643828 +0000 UTC m=+171.184018681" watchObservedRunningTime="2026-04-16 14:02:13.844873908 +0000 UTC m=+171.185248760"
Apr 16 14:02:14.149159 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.149084 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74"
Apr 16 14:02:14.522577 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.522538 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-569cc5dbc4-z49sm"]
Apr 16 14:02:14.526096 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.526068 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.529519 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.529494 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:02:14.529646 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.529595 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:02:14.530334 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.529861 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:02:14.530334 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.529890 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:02:14.530334 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.529933 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vvc8w\""
Apr 16 14:02:14.530334 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.529955 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:02:14.535356 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.535336 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:02:14.539317 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.539295 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-569cc5dbc4-z49sm"]
Apr 16 14:02:14.678487 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678455 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-config\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.678676 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678514 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-oauth-serving-cert\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.678676 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-serving-cert\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.678676 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678592 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwd2\" (UniqueName: \"kubernetes.io/projected/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-kube-api-access-clwd2\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.678828 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-trusted-ca-bundle\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.678828 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678792 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-oauth-config\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.678959 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.678836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-service-ca\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.779797 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-oauth-config\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.779797 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-service-ca\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.779797 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-config\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780025 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-oauth-serving-cert\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780025 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-serving-cert\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780025 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clwd2\" (UniqueName: \"kubernetes.io/projected/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-kube-api-access-clwd2\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780025 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.779937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-trusted-ca-bundle\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780667 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.780640 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-oauth-serving-cert\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780834 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.780809 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-trusted-ca-bundle\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780834 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.780721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-service-ca\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.780955 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.780713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-config\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.782621 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.782595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-serving-cert\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.782728 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.782633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-oauth-config\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.789127 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.789081 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwd2\" (UniqueName: \"kubernetes.io/projected/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-kube-api-access-clwd2\") pod \"console-569cc5dbc4-z49sm\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") " pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.825159 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.825117 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-bg6tj" event={"ID":"5f8c6a6f-0f8c-4610-a6c5-f33111156650","Type":"ContainerStarted","Data":"591ab6c3b68d5548bfa973f99b1893d90975ec656bb134f709733d644ceb8a62"}
Apr 16 14:02:14.837720 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.837677 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:14.984699 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:14.984673 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-569cc5dbc4-z49sm"]
Apr 16 14:02:14.987451 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:02:14.987391 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca3d2bf_b21e_4c5a_ac30_ff817fb6aa24.slice/crio-21457db7a9d9891ce70c2fb8eac22ee1492a98e0f2d6c92ee85e29df9daf9bfe WatchSource:0}: Error finding container 21457db7a9d9891ce70c2fb8eac22ee1492a98e0f2d6c92ee85e29df9daf9bfe: Status 404 returned error can't find the container with id 21457db7a9d9891ce70c2fb8eac22ee1492a98e0f2d6c92ee85e29df9daf9bfe
Apr 16 14:02:15.799607 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:15.799490 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p6gtp"
Apr 16 14:02:15.829992 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:15.829863 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569cc5dbc4-z49sm" event={"ID":"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24","Type":"ContainerStarted","Data":"21457db7a9d9891ce70c2fb8eac22ee1492a98e0f2d6c92ee85e29df9daf9bfe"}
Apr 16 14:02:19.846455 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:19.846415 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569cc5dbc4-z49sm" event={"ID":"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24","Type":"ContainerStarted","Data":"a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500"}
Apr 16 14:02:19.864075 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:19.864028 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-569cc5dbc4-z49sm" podStartSLOduration=2.130145888 podStartE2EDuration="5.864014488s" podCreationTimestamp="2026-04-16 14:02:14 +0000 UTC" firstStartedPulling="2026-04-16 14:02:14.989692308 +0000 UTC m=+172.330067142" lastFinishedPulling="2026-04-16 14:02:18.723560896 +0000 UTC m=+176.063935742" observedRunningTime="2026-04-16 14:02:19.862119009 +0000 UTC m=+177.202493862" watchObservedRunningTime="2026-04-16 14:02:19.864014488 +0000 UTC m=+177.204389339"
Apr 16 14:02:24.838021 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:24.837982 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:24.838021 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:24.838024 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:24.842692 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:24.842670 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:24.863955 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:24.863906 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:02:29.165103 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:29.165057 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" podUID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerName="registry" containerID="cri-o://b690b8f4cc4901a36064576f9e423dae22ebb69ef4c2dfd30acd1c548d795d09" gracePeriod=30
Apr 16 14:02:30.878747 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:30.878711 2569 generic.go:358] "Generic (PLEG): container finished" podID="870e7376-e0fa-40ca-ad2c-98fa6189639f" containerID="b690b8f4cc4901a36064576f9e423dae22ebb69ef4c2dfd30acd1c548d795d09" exitCode=0
Apr 16 14:02:30.879288 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:30.878798 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" event={"ID":"870e7376-e0fa-40ca-ad2c-98fa6189639f","Type":"ContainerDied","Data":"b690b8f4cc4901a36064576f9e423dae22ebb69ef4c2dfd30acd1c548d795d09"}
Apr 16 14:02:30.879288 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:30.878831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74" event={"ID":"870e7376-e0fa-40ca-ad2c-98fa6189639f","Type":"ContainerStarted","Data":"22f961683babb03597f040259f255c1a3006ba79eea45386b991bbaa1c8e9526"}
Apr 16 14:02:30.879288 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:30.878877 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74"
Apr 16 14:02:36.057276 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:36.057239 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-ht2bj_a78f3464-a81c-413a-a11a-ba6020b56874/cluster-monitoring-operator/0.log"
Apr 16 14:02:36.255235 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:36.255210 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qbddt_29aa9554-40e1-4efd-b0bc-34ab8445a858/kube-state-metrics/0.log"
Apr 16 14:02:36.454215 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:36.454185 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qbddt_29aa9554-40e1-4efd-b0bc-34ab8445a858/kube-rbac-proxy-main/0.log"
Apr 16 14:02:36.654840 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:36.654805 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qbddt_29aa9554-40e1-4efd-b0bc-34ab8445a858/kube-rbac-proxy-self/0.log"
Apr 16 14:02:37.854466 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:37.854438 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5866k_0aabbbc1-d84a-4c1a-ae38-92ce1b31b836/init-textfile/0.log"
Apr 16 14:02:38.055901 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:38.055875 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5866k_0aabbbc1-d84a-4c1a-ae38-92ce1b31b836/node-exporter/0.log"
Apr 16 14:02:38.257428 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:38.257382 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5866k_0aabbbc1-d84a-4c1a-ae38-92ce1b31b836/kube-rbac-proxy/0.log"
Apr 16 14:02:42.857786 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:42.857746 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-c4d9z_1a771ca7-2942-4693-8a9f-243a8e6f82d5/networking-console-plugin/0.log"
Apr 16 14:02:43.054936 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:43.054909 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:02:43.262049 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:43.262000 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/3.log"
Apr 16 14:02:43.454689 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:43.454658 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-569cc5dbc4-z49sm_9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24/console/0.log"
Apr 16 14:02:43.856786 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:43.856757 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-wqjg7_06c957ef-10ea-4050-a9a1-35994a3e35f8/download-server/0.log"
Apr 16 14:02:44.255297 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:44.255265 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d7569c8cd-4qw74_870e7376-e0fa-40ca-ad2c-98fa6189639f/registry/0.log"
Apr 16 14:02:44.455784 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:44.455759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d7569c8cd-4qw74_870e7376-e0fa-40ca-ad2c-98fa6189639f/registry/1.log"
Apr 16 14:02:44.854738 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:44.854711 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xv4n6_d6588176-b995-4b14-80e6-c2ba40893912/node-ca/0.log"
Apr 16 14:02:45.254967 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:45.254941 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c4787ff58-x4l8s_e5a40466-a66f-4e5a-b8ea-43dd46b22ac1/router/0.log"
Apr 16 14:02:45.654110 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:45.654027 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b4d9q_852d4d38-4926-4c6a-a9ad-11a60019138a/serve-healthcheck-canary/0.log"
Apr 16 14:02:51.886234 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:02:51.886202 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d7569c8cd-4qw74"
Apr 16 14:03:19.776769 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:19.776729 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-569cc5dbc4-z49sm"]
Apr 16 14:03:44.796392 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:44.796309 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-569cc5dbc4-z49sm" podUID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" containerName="console" containerID="cri-o://a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500" gracePeriod=15
Apr 16 14:03:44.860748 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:44.860708 2569 patch_prober.go:28] interesting pod/console-569cc5dbc4-z49sm container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused" start-of-body=
Apr 16 14:03:44.860903 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:44.860775 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-569cc5dbc4-z49sm" podUID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" containerName="console" probeResult="failure" output="Get \"https://10.134.0.17:8443/health\": dial tcp 10.134.0.17:8443: connect: connection refused"
Apr 16 14:03:45.035735 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.035711 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-569cc5dbc4-z49sm_9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24/console/0.log"
Apr 16 14:03:45.035891 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.035783 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:03:45.093795 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.093710 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-569cc5dbc4-z49sm_9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24/console/0.log"
Apr 16 14:03:45.093795 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.093755 2569 generic.go:358] "Generic (PLEG): container finished" podID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" containerID="a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500" exitCode=2
Apr 16 14:03:45.094018 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.093794 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569cc5dbc4-z49sm" event={"ID":"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24","Type":"ContainerDied","Data":"a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500"}
Apr 16 14:03:45.094018 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.093820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-569cc5dbc4-z49sm" event={"ID":"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24","Type":"ContainerDied","Data":"21457db7a9d9891ce70c2fb8eac22ee1492a98e0f2d6c92ee85e29df9daf9bfe"}
Apr 16 14:03:45.094018 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.093823 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-569cc5dbc4-z49sm"
Apr 16 14:03:45.094018 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.093837 2569 scope.go:117] "RemoveContainer" containerID="a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500"
Apr 16 14:03:45.101249 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.101229 2569 scope.go:117] "RemoveContainer" containerID="a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500"
Apr 16 14:03:45.101557 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:03:45.101533 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500\": container with ID starting with a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500 not found: ID does not exist" containerID="a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500"
Apr 16 14:03:45.101636 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.101564 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500"} err="failed to get container status \"a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500\": rpc error: code = NotFound desc = could not find container \"a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500\": container with ID starting with a1e1a26c3226174be967605c4076bd5f4a4f03d0da7e5a81e68e5d3719bbf500 not found: ID does not exist"
Apr 16 14:03:45.197730 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.197689 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-trusted-ca-bundle\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.197922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.197742 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-service-ca\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.197922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.197763 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-oauth-serving-cert\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.197922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.197852 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clwd2\" (UniqueName: \"kubernetes.io/projected/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-kube-api-access-clwd2\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.197922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.197902 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-config\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.198136 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.197942 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-serving-cert\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.198136 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.198009 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-oauth-config\") pod \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\" (UID: \"9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24\") "
Apr 16 14:03:45.198281 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.198258 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:45.198337 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.198273 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-service-ca" (OuterVolumeSpecName: "service-ca") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:45.198419 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.198370 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-config" (OuterVolumeSpecName: "console-config") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:45.198492 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.198388 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:03:45.200193 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.200169 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-kube-api-access-clwd2" (OuterVolumeSpecName: "kube-api-access-clwd2") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "kube-api-access-clwd2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:03:45.200193 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.200174 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:45.200337 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.200187 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" (UID: "9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:03:45.298823 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298782 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-oauth-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.298823 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298815 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-trusted-ca-bundle\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.298823 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298826 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-service-ca\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.298823 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298835 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-oauth-serving-cert\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.299086 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298844 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clwd2\" (UniqueName: \"kubernetes.io/projected/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-kube-api-access-clwd2\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.299086 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298853 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.299086 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.298861 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24-console-serving-cert\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Apr 16 14:03:45.415311 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.415224 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-569cc5dbc4-z49sm"]
Apr 16 14:03:45.427219 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:45.427188 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-569cc5dbc4-z49sm"]
Apr 16 14:03:47.312212 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:03:47.312179 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" path="/var/lib/kubelet/pods/9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24/volumes"
Apr 16 14:04:23.196259 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:04:23.196226 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:04:23.196885 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:04:23.196270 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:04:23.202284 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:04:23.202262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:04:23.202388 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:04:23.202269 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:04:23.205712 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:04:23.205689 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 14:05:03.298929 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.298886 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-l9j7q"]
Apr 16 14:05:03.299333 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.299311 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" containerName="console"
Apr 16 14:05:03.299333 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.299326 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" containerName="console"
Apr 16 14:05:03.299434 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.299367 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ca3d2bf-b21e-4c5a-ac30-ff817fb6aa24" containerName="console"
Apr 16 14:05:03.302053 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.302038 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l9j7q"
Apr 16 14:05:03.304590 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.304567 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:05:03.318768 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.318743 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l9j7q"]
Apr 16 14:05:03.376046 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.376004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-original-pull-secret\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q"
Apr 16 14:05:03.376241 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.376064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-kubelet-config\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q"
Apr 16 14:05:03.376241 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.376104 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-dbus\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q"
Apr 16 14:05:03.477227 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.477186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-kubelet-config\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.477227 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.477227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-dbus\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.477461 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.477284 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-original-pull-secret\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.477461 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.477314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-kubelet-config\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.477530 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.477461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-dbus\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.479552 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.479527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d4be6a59-e6b2-4033-95df-e0f99a6fe1e3-original-pull-secret\") pod \"global-pull-secret-syncer-l9j7q\" (UID: \"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3\") " pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.618206 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.618115 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-l9j7q" Apr 16 14:05:03.742805 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.742779 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-l9j7q"] Apr 16 14:05:03.745498 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:05:03.745468 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4be6a59_e6b2_4033_95df_e0f99a6fe1e3.slice/crio-c7f1202076c3936da192f7248456762012f688ebfed0228366f109475ac44383 WatchSource:0}: Error finding container c7f1202076c3936da192f7248456762012f688ebfed0228366f109475ac44383: Status 404 returned error can't find the container with id c7f1202076c3936da192f7248456762012f688ebfed0228366f109475ac44383 Apr 16 14:05:03.747111 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:03.747095 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:05:04.295603 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:04.295560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l9j7q" event={"ID":"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3","Type":"ContainerStarted","Data":"c7f1202076c3936da192f7248456762012f688ebfed0228366f109475ac44383"} Apr 16 14:05:09.314309 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:09.314272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-l9j7q" 
event={"ID":"d4be6a59-e6b2-4033-95df-e0f99a6fe1e3","Type":"ContainerStarted","Data":"230db66dd6730e66deafe4aac72085ae97c93bc1ca10fc2dab5cd21c4860b1de"} Apr 16 14:05:09.330189 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:09.330141 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-l9j7q" podStartSLOduration=1.79848177 podStartE2EDuration="6.330126044s" podCreationTimestamp="2026-04-16 14:05:03 +0000 UTC" firstStartedPulling="2026-04-16 14:05:03.74722276 +0000 UTC m=+341.087597591" lastFinishedPulling="2026-04-16 14:05:08.278867031 +0000 UTC m=+345.619241865" observedRunningTime="2026-04-16 14:05:09.328284987 +0000 UTC m=+346.668659840" watchObservedRunningTime="2026-04-16 14:05:09.330126044 +0000 UTC m=+346.670500953" Apr 16 14:05:41.230324 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.230242 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd"] Apr 16 14:05:41.253181 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.253152 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd"] Apr 16 14:05:41.253362 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.253297 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.256457 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.256430 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-d8xkz\"" Apr 16 14:05:41.257119 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.257100 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:05:41.257363 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.257345 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:05:41.267036 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.267017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.267116 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.267046 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvkbr\" (UniqueName: \"kubernetes.io/projected/45a848a6-05bc-4782-8b3a-862718f28a9a-kube-api-access-zvkbr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.267155 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.267111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.367786 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.367748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.367963 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.367812 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.367963 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.367934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvkbr\" (UniqueName: \"kubernetes.io/projected/45a848a6-05bc-4782-8b3a-862718f28a9a-kube-api-access-zvkbr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.368173 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.368154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-bundle\") pod 
\"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.368211 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.368160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.376893 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.376867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvkbr\" (UniqueName: \"kubernetes.io/projected/45a848a6-05bc-4782-8b3a-862718f28a9a-kube-api-access-zvkbr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.562433 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.562314 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:41.682741 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:41.682708 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd"] Apr 16 14:05:41.686097 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:05:41.686072 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a848a6_05bc_4782_8b3a_862718f28a9a.slice/crio-48abceb1307c73de67b7ce46d57d2a0dcc1033bb9064c02cf6fbd9ec2923feb5 WatchSource:0}: Error finding container 48abceb1307c73de67b7ce46d57d2a0dcc1033bb9064c02cf6fbd9ec2923feb5: Status 404 returned error can't find the container with id 48abceb1307c73de67b7ce46d57d2a0dcc1033bb9064c02cf6fbd9ec2923feb5 Apr 16 14:05:42.403687 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:42.403648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" event={"ID":"45a848a6-05bc-4782-8b3a-862718f28a9a","Type":"ContainerStarted","Data":"48abceb1307c73de67b7ce46d57d2a0dcc1033bb9064c02cf6fbd9ec2923feb5"} Apr 16 14:05:47.419672 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:47.419635 2569 generic.go:358] "Generic (PLEG): container finished" podID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerID="8628b2ea8b8115aa8527c6d162d1c271b0880daa42204614b4f3d980de1a2769" exitCode=0 Apr 16 14:05:47.420063 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:47.419723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" event={"ID":"45a848a6-05bc-4782-8b3a-862718f28a9a","Type":"ContainerDied","Data":"8628b2ea8b8115aa8527c6d162d1c271b0880daa42204614b4f3d980de1a2769"} Apr 16 14:05:50.429725 ip-10-0-142-16 kubenswrapper[2569]: I0416 
14:05:50.429688 2569 generic.go:358] "Generic (PLEG): container finished" podID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerID="8c2df0c0a92c9c8bab315b19a507446f9c759f4005c1fa088aad50ab5f78df81" exitCode=0 Apr 16 14:05:50.430185 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:50.429779 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" event={"ID":"45a848a6-05bc-4782-8b3a-862718f28a9a","Type":"ContainerDied","Data":"8c2df0c0a92c9c8bab315b19a507446f9c759f4005c1fa088aad50ab5f78df81"} Apr 16 14:05:57.456762 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:57.456727 2569 generic.go:358] "Generic (PLEG): container finished" podID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerID="48bfcb8da87193fff79d7a9253c4750cda799c48ded8247c11e169811eb83c5a" exitCode=0 Apr 16 14:05:57.457145 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:57.456807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" event={"ID":"45a848a6-05bc-4782-8b3a-862718f28a9a","Type":"ContainerDied","Data":"48bfcb8da87193fff79d7a9253c4750cda799c48ded8247c11e169811eb83c5a"} Apr 16 14:05:58.578985 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.578962 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:05:58.714958 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.714864 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-util\") pod \"45a848a6-05bc-4782-8b3a-862718f28a9a\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " Apr 16 14:05:58.714958 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.714941 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-bundle\") pod \"45a848a6-05bc-4782-8b3a-862718f28a9a\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " Apr 16 14:05:58.715156 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.714987 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvkbr\" (UniqueName: \"kubernetes.io/projected/45a848a6-05bc-4782-8b3a-862718f28a9a-kube-api-access-zvkbr\") pod \"45a848a6-05bc-4782-8b3a-862718f28a9a\" (UID: \"45a848a6-05bc-4782-8b3a-862718f28a9a\") " Apr 16 14:05:58.715465 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.715433 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-bundle" (OuterVolumeSpecName: "bundle") pod "45a848a6-05bc-4782-8b3a-862718f28a9a" (UID: "45a848a6-05bc-4782-8b3a-862718f28a9a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:05:58.717124 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.717095 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a848a6-05bc-4782-8b3a-862718f28a9a-kube-api-access-zvkbr" (OuterVolumeSpecName: "kube-api-access-zvkbr") pod "45a848a6-05bc-4782-8b3a-862718f28a9a" (UID: "45a848a6-05bc-4782-8b3a-862718f28a9a"). InnerVolumeSpecName "kube-api-access-zvkbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:05:58.720370 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.720345 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-util" (OuterVolumeSpecName: "util") pod "45a848a6-05bc-4782-8b3a-862718f28a9a" (UID: "45a848a6-05bc-4782-8b3a-862718f28a9a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:05:58.815701 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.815664 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-util\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Apr 16 14:05:58.815701 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.815694 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45a848a6-05bc-4782-8b3a-862718f28a9a-bundle\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Apr 16 14:05:58.815701 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:58.815704 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zvkbr\" (UniqueName: \"kubernetes.io/projected/45a848a6-05bc-4782-8b3a-862718f28a9a-kube-api-access-zvkbr\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Apr 16 14:05:59.464795 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:59.464756 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" event={"ID":"45a848a6-05bc-4782-8b3a-862718f28a9a","Type":"ContainerDied","Data":"48abceb1307c73de67b7ce46d57d2a0dcc1033bb9064c02cf6fbd9ec2923feb5"} Apr 16 14:05:59.464795 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:59.464795 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48abceb1307c73de67b7ce46d57d2a0dcc1033bb9064c02cf6fbd9ec2923feb5" Apr 16 14:05:59.464990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:05:59.464847 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c9gvvd" Apr 16 14:06:03.231877 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.231840 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b"] Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232128 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="pull" Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232141 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="pull" Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232153 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="util" Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232158 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="util" Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232167 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="extract" Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232172 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="extract" Apr 16 14:06:03.232328 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.232224 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="45a848a6-05bc-4782-8b3a-862718f28a9a" containerName="extract" Apr 16 14:06:03.271453 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.271418 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b"] Apr 16 14:06:03.271613 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.271575 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.275260 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.275238 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 14:06:03.275507 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.275488 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-44zj5\"" Apr 16 14:06:03.275616 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.275596 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 14:06:03.276281 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.276266 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 14:06:03.352068 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.352033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m26d2\" (UniqueName: 
\"kubernetes.io/projected/448bd791-aeec-4394-8407-46909378518d-kube-api-access-m26d2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-blh4b\" (UID: \"448bd791-aeec-4394-8407-46909378518d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.352068 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.352078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/448bd791-aeec-4394-8407-46909378518d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-blh4b\" (UID: \"448bd791-aeec-4394-8407-46909378518d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.452678 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.452640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m26d2\" (UniqueName: \"kubernetes.io/projected/448bd791-aeec-4394-8407-46909378518d-kube-api-access-m26d2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-blh4b\" (UID: \"448bd791-aeec-4394-8407-46909378518d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.452678 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.452685 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/448bd791-aeec-4394-8407-46909378518d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-blh4b\" (UID: \"448bd791-aeec-4394-8407-46909378518d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.455038 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.455008 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/448bd791-aeec-4394-8407-46909378518d-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-blh4b\" (UID: 
\"448bd791-aeec-4394-8407-46909378518d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.461190 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.461168 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m26d2\" (UniqueName: \"kubernetes.io/projected/448bd791-aeec-4394-8407-46909378518d-kube-api-access-m26d2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-blh4b\" (UID: \"448bd791-aeec-4394-8407-46909378518d\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.581573 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.581482 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:03.708532 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:03.708500 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b"] Apr 16 14:06:03.711850 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:06:03.711823 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448bd791_aeec_4394_8407_46909378518d.slice/crio-0c7d34e9d5a9ce11605207ba32f4059a0c6827d2a22abfbe4c3a5150c5d83a2d WatchSource:0}: Error finding container 0c7d34e9d5a9ce11605207ba32f4059a0c6827d2a22abfbe4c3a5150c5d83a2d: Status 404 returned error can't find the container with id 0c7d34e9d5a9ce11605207ba32f4059a0c6827d2a22abfbe4c3a5150c5d83a2d Apr 16 14:06:04.479773 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:04.479731 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" event={"ID":"448bd791-aeec-4394-8407-46909378518d","Type":"ContainerStarted","Data":"0c7d34e9d5a9ce11605207ba32f4059a0c6827d2a22abfbe4c3a5150c5d83a2d"} Apr 16 14:06:08.492691 ip-10-0-142-16 kubenswrapper[2569]: I0416 
14:06:08.492660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" event={"ID":"448bd791-aeec-4394-8407-46909378518d","Type":"ContainerStarted","Data":"97a34b16a00abf313bd099babedf9284e211ea930498bf3e3551530e1ed79eda"} Apr 16 14:06:08.493121 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.492827 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" Apr 16 14:06:08.513185 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.513133 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b" podStartSLOduration=0.832673073 podStartE2EDuration="5.513119098s" podCreationTimestamp="2026-04-16 14:06:03 +0000 UTC" firstStartedPulling="2026-04-16 14:06:03.714182167 +0000 UTC m=+401.054556998" lastFinishedPulling="2026-04-16 14:06:08.394628187 +0000 UTC m=+405.735003023" observedRunningTime="2026-04-16 14:06:08.511049771 +0000 UTC m=+405.851424623" watchObservedRunningTime="2026-04-16 14:06:08.513119098 +0000 UTC m=+405.853493950" Apr 16 14:06:08.953655 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.953615 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-54nqx"] Apr 16 14:06:08.970320 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.970287 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-54nqx"] Apr 16 14:06:08.970511 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.970370 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:08.972507 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.972471 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 14:06:08.972507 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.972477 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-lpcpn\""
Apr 16 14:06:08.972713 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:08.972477 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 14:06:09.099158 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.099129 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/986c32cb-58f0-407f-a253-dea6caa26027-cabundle0\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.099321 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.099177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.099321 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.099205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7x7x\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-kube-api-access-f7x7x\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.200289 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.200254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/986c32cb-58f0-407f-a253-dea6caa26027-cabundle0\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.200477 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.200305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.200477 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.200341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7x7x\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-kube-api-access-f7x7x\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.200559 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.200478 2569 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 16 14:06:09.200559 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.200499 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:06:09.200559 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.200508 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:06:09.200559 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.200524 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-54nqx: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 14:06:09.200681 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.200596 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates podName:986c32cb-58f0-407f-a253-dea6caa26027 nodeName:}" failed. No retries permitted until 2026-04-16 14:06:09.700576597 +0000 UTC m=+407.040951433 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates") pod "keda-operator-ffbb595cb-54nqx" (UID: "986c32cb-58f0-407f-a253-dea6caa26027") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 14:06:09.200904 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.200885 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/986c32cb-58f0-407f-a253-dea6caa26027-cabundle0\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.208936 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.208874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7x7x\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-kube-api-access-f7x7x\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.398476 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.398446 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"]
Apr 16 14:06:09.411171 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.410935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.413059 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.413036 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 14:06:09.413059 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.413057 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"]
Apr 16 14:06:09.502863 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.502774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.502863 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.502826 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5018b7f7-38b7-4034-8f83-530879b1668c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.503239 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.502885 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4ng\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-kube-api-access-nz4ng\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.604002 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.603962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4ng\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-kube-api-access-nz4ng\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.604185 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.604091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.604185 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.604135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5018b7f7-38b7-4034-8f83-530879b1668c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.604304 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.604259 2569 secret.go:281] references non-existent secret key: tls.crt
Apr 16 14:06:09.604304 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.604277 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 14:06:09.604304 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.604294 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs: references non-existent secret key: tls.crt
Apr 16 14:06:09.604494 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.604342 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates podName:5018b7f7-38b7-4034-8f83-530879b1668c nodeName:}" failed. No retries permitted until 2026-04-16 14:06:10.104327822 +0000 UTC m=+407.444702658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates") pod "keda-metrics-apiserver-7c9f485588-qqqqs" (UID: "5018b7f7-38b7-4034-8f83-530879b1668c") : references non-existent secret key: tls.crt
Apr 16 14:06:09.604563 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.604539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/5018b7f7-38b7-4034-8f83-530879b1668c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.613071 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.613044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4ng\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-kube-api-access-nz4ng\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:09.704725 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.704688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:09.704889 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.704831 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:06:09.704889 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.704853 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:06:09.704889 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.704862 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-54nqx: references non-existent secret key: ca.crt
Apr 16 14:06:09.705015 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:09.704916 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates podName:986c32cb-58f0-407f-a253-dea6caa26027 nodeName:}" failed. No retries permitted until 2026-04-16 14:06:10.704901454 +0000 UTC m=+408.045276285 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates") pod "keda-operator-ffbb595cb-54nqx" (UID: "986c32cb-58f0-407f-a253-dea6caa26027") : references non-existent secret key: ca.crt
Apr 16 14:06:09.786958 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.786882 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-gjmz5"]
Apr 16 14:06:09.798191 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.798159 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:09.800054 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.800033 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 14:06:09.800237 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.800217 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-gjmz5"]
Apr 16 14:06:09.905845 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.905812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwtg6\" (UniqueName: \"kubernetes.io/projected/7e444466-5661-4707-9f6c-6fe7275f0bfa-kube-api-access-gwtg6\") pod \"keda-admission-cf49989db-gjmz5\" (UID: \"7e444466-5661-4707-9f6c-6fe7275f0bfa\") " pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:09.906012 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:09.905883 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7e444466-5661-4707-9f6c-6fe7275f0bfa-certificates\") pod \"keda-admission-cf49989db-gjmz5\" (UID: \"7e444466-5661-4707-9f6c-6fe7275f0bfa\") " pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:10.006897 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.006862 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7e444466-5661-4707-9f6c-6fe7275f0bfa-certificates\") pod \"keda-admission-cf49989db-gjmz5\" (UID: \"7e444466-5661-4707-9f6c-6fe7275f0bfa\") " pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:10.007066 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.007002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwtg6\" (UniqueName: \"kubernetes.io/projected/7e444466-5661-4707-9f6c-6fe7275f0bfa-kube-api-access-gwtg6\") pod \"keda-admission-cf49989db-gjmz5\" (UID: \"7e444466-5661-4707-9f6c-6fe7275f0bfa\") " pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:10.009428 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.009384 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7e444466-5661-4707-9f6c-6fe7275f0bfa-certificates\") pod \"keda-admission-cf49989db-gjmz5\" (UID: \"7e444466-5661-4707-9f6c-6fe7275f0bfa\") " pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:10.015247 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.015220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwtg6\" (UniqueName: \"kubernetes.io/projected/7e444466-5661-4707-9f6c-6fe7275f0bfa-kube-api-access-gwtg6\") pod \"keda-admission-cf49989db-gjmz5\" (UID: \"7e444466-5661-4707-9f6c-6fe7275f0bfa\") " pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:10.108497 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.108385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:10.108640 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.108539 2569 secret.go:281] references non-existent secret key: tls.crt
Apr 16 14:06:10.108640 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.108556 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 14:06:10.108640 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.108574 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs: references non-existent secret key: tls.crt
Apr 16 14:06:10.108640 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.108636 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates podName:5018b7f7-38b7-4034-8f83-530879b1668c nodeName:}" failed. No retries permitted until 2026-04-16 14:06:11.108621179 +0000 UTC m=+408.448996011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates") pod "keda-metrics-apiserver-7c9f485588-qqqqs" (UID: "5018b7f7-38b7-4034-8f83-530879b1668c") : references non-existent secret key: tls.crt
Apr 16 14:06:10.111302 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.111279 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:10.236291 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.236252 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-gjmz5"]
Apr 16 14:06:10.240835 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:06:10.240807 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e444466_5661_4707_9f6c_6fe7275f0bfa.slice/crio-b8b3a466c2d92fa2ac0253299c3edc7d31d05788c44063b46a0a46008365f05e WatchSource:0}: Error finding container b8b3a466c2d92fa2ac0253299c3edc7d31d05788c44063b46a0a46008365f05e: Status 404 returned error can't find the container with id b8b3a466c2d92fa2ac0253299c3edc7d31d05788c44063b46a0a46008365f05e
Apr 16 14:06:10.499585 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.499548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-gjmz5" event={"ID":"7e444466-5661-4707-9f6c-6fe7275f0bfa","Type":"ContainerStarted","Data":"b8b3a466c2d92fa2ac0253299c3edc7d31d05788c44063b46a0a46008365f05e"}
Apr 16 14:06:10.713088 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:10.713048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:10.713481 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.713216 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:06:10.713481 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.713239 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:06:10.713481 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.713252 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-54nqx: references non-existent secret key: ca.crt
Apr 16 14:06:10.713481 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:10.713330 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates podName:986c32cb-58f0-407f-a253-dea6caa26027 nodeName:}" failed. No retries permitted until 2026-04-16 14:06:12.71330959 +0000 UTC m=+410.053684423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates") pod "keda-operator-ffbb595cb-54nqx" (UID: "986c32cb-58f0-407f-a253-dea6caa26027") : references non-existent secret key: ca.crt
Apr 16 14:06:11.116918 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:11.116884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:11.117150 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:11.117026 2569 secret.go:281] references non-existent secret key: tls.crt
Apr 16 14:06:11.117150 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:11.117047 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 14:06:11.117150 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:11.117067 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs: references non-existent secret key: tls.crt
Apr 16 14:06:11.117150 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:11.117119 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates podName:5018b7f7-38b7-4034-8f83-530879b1668c nodeName:}" failed. No retries permitted until 2026-04-16 14:06:13.117104983 +0000 UTC m=+410.457479819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates") pod "keda-metrics-apiserver-7c9f485588-qqqqs" (UID: "5018b7f7-38b7-4034-8f83-530879b1668c") : references non-existent secret key: tls.crt
Apr 16 14:06:12.506938 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:12.506899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-gjmz5" event={"ID":"7e444466-5661-4707-9f6c-6fe7275f0bfa","Type":"ContainerStarted","Data":"5218de694e5147b585c1d9b0ed3eb60bb17da557a148ddaff8c00569153da82d"}
Apr 16 14:06:12.507300 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:12.507010 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:12.526439 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:12.526367 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-gjmz5" podStartSLOduration=1.646127066 podStartE2EDuration="3.526349328s" podCreationTimestamp="2026-04-16 14:06:09 +0000 UTC" firstStartedPulling="2026-04-16 14:06:10.241985363 +0000 UTC m=+407.582360194" lastFinishedPulling="2026-04-16 14:06:12.122207625 +0000 UTC m=+409.462582456" observedRunningTime="2026-04-16 14:06:12.524480607 +0000 UTC m=+409.864855459" watchObservedRunningTime="2026-04-16 14:06:12.526349328 +0000 UTC m=+409.866724183"
Apr 16 14:06:12.732311 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:12.732266 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:12.732493 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:12.732417 2569 secret.go:281] references non-existent secret key: ca.crt
Apr 16 14:06:12.732493 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:12.732438 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 14:06:12.732493 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:12.732447 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-54nqx: references non-existent secret key: ca.crt
Apr 16 14:06:12.732596 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:12.732508 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates podName:986c32cb-58f0-407f-a253-dea6caa26027 nodeName:}" failed. No retries permitted until 2026-04-16 14:06:16.732492634 +0000 UTC m=+414.072867465 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates") pod "keda-operator-ffbb595cb-54nqx" (UID: "986c32cb-58f0-407f-a253-dea6caa26027") : references non-existent secret key: ca.crt
Apr 16 14:06:13.135318 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:13.135274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:13.135550 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:13.135439 2569 secret.go:281] references non-existent secret key: tls.crt
Apr 16 14:06:13.135550 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:13.135453 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 14:06:13.135550 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:13.135473 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs: references non-existent secret key: tls.crt
Apr 16 14:06:13.135550 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:06:13.135525 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates podName:5018b7f7-38b7-4034-8f83-530879b1668c nodeName:}" failed. No retries permitted until 2026-04-16 14:06:17.135509149 +0000 UTC m=+414.475883992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates") pod "keda-metrics-apiserver-7c9f485588-qqqqs" (UID: "5018b7f7-38b7-4034-8f83-530879b1668c") : references non-existent secret key: tls.crt
Apr 16 14:06:16.764705 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:16.764666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:16.767236 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:16.767211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/986c32cb-58f0-407f-a253-dea6caa26027-certificates\") pod \"keda-operator-ffbb595cb-54nqx\" (UID: \"986c32cb-58f0-407f-a253-dea6caa26027\") " pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:16.781019 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:16.780983 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:16.902435 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:16.902383 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-54nqx"]
Apr 16 14:06:16.905589 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:06:16.905560 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod986c32cb_58f0_407f_a253_dea6caa26027.slice/crio-f225848c61d56178a630b88e6750ab5729a886d523f90037eb01caf81060a63f WatchSource:0}: Error finding container f225848c61d56178a630b88e6750ab5729a886d523f90037eb01caf81060a63f: Status 404 returned error can't find the container with id f225848c61d56178a630b88e6750ab5729a886d523f90037eb01caf81060a63f
Apr 16 14:06:17.169445 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:17.169307 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:17.171894 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:17.171874 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5018b7f7-38b7-4034-8f83-530879b1668c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qqqqs\" (UID: \"5018b7f7-38b7-4034-8f83-530879b1668c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:17.221905 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:17.221857 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:17.345150 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:17.345124 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"]
Apr 16 14:06:17.347371 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:06:17.347343 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5018b7f7_38b7_4034_8f83_530879b1668c.slice/crio-689c463071255298e16e627842abb7583167b13dd66cca4b0f3c6171a0a3cf25 WatchSource:0}: Error finding container 689c463071255298e16e627842abb7583167b13dd66cca4b0f3c6171a0a3cf25: Status 404 returned error can't find the container with id 689c463071255298e16e627842abb7583167b13dd66cca4b0f3c6171a0a3cf25
Apr 16 14:06:17.524354 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:17.524319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs" event={"ID":"5018b7f7-38b7-4034-8f83-530879b1668c","Type":"ContainerStarted","Data":"689c463071255298e16e627842abb7583167b13dd66cca4b0f3c6171a0a3cf25"}
Apr 16 14:06:17.525325 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:17.525303 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-54nqx" event={"ID":"986c32cb-58f0-407f-a253-dea6caa26027","Type":"ContainerStarted","Data":"f225848c61d56178a630b88e6750ab5729a886d523f90037eb01caf81060a63f"}
Apr 16 14:06:21.541767 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:21.541733 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs" event={"ID":"5018b7f7-38b7-4034-8f83-530879b1668c","Type":"ContainerStarted","Data":"f761288735cab9c35f7217a56f90cfb6f54f8c451f9f12797a571dbb46149a2a"}
Apr 16 14:06:21.542259 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:21.541828 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:21.543035 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:21.543016 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-54nqx" event={"ID":"986c32cb-58f0-407f-a253-dea6caa26027","Type":"ContainerStarted","Data":"936fc6a5892025bf2b90444bbc1b92b1fcdada4eb59efdee45b6bb3b7c858c8c"}
Apr 16 14:06:21.543144 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:21.543111 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:06:21.570346 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:21.570287 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs" podStartSLOduration=8.999287472 podStartE2EDuration="12.570271053s" podCreationTimestamp="2026-04-16 14:06:09 +0000 UTC" firstStartedPulling="2026-04-16 14:06:17.348739651 +0000 UTC m=+414.689114482" lastFinishedPulling="2026-04-16 14:06:20.919723229 +0000 UTC m=+418.260098063" observedRunningTime="2026-04-16 14:06:21.569095348 +0000 UTC m=+418.909470203" watchObservedRunningTime="2026-04-16 14:06:21.570271053 +0000 UTC m=+418.910645905"
Apr 16 14:06:21.609142 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:21.609069 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-54nqx" podStartSLOduration=9.597047071 podStartE2EDuration="13.609053856s" podCreationTimestamp="2026-04-16 14:06:08 +0000 UTC" firstStartedPulling="2026-04-16 14:06:16.907299235 +0000 UTC m=+414.247674066" lastFinishedPulling="2026-04-16 14:06:20.919306019 +0000 UTC m=+418.259680851" observedRunningTime="2026-04-16 14:06:21.607974218 +0000 UTC m=+418.948349071" watchObservedRunningTime="2026-04-16 14:06:21.609053856 +0000 UTC m=+418.949428708"
Apr 16 14:06:29.498795 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:29.498761 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-blh4b"
Apr 16 14:06:32.550839 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:32.550810 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qqqqs"
Apr 16 14:06:33.512576 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:33.512544 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-gjmz5"
Apr 16 14:06:42.548615 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:06:42.548529 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-54nqx"
Apr 16 14:07:14.850626 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.850592 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dn9fg"]
Apr 16 14:07:14.853783 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.853766 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg"
Apr 16 14:07:14.855717 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.855688 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 14:07:14.855859 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.855828 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 14:07:14.855922 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.855871 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-484xl\""
Apr 16 14:07:14.856190 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.856176 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 14:07:14.858999 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.858974 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rh422"]
Apr 16 14:07:14.862202 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.862169 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:14.862918 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.862897 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dn9fg"] Apr 16 14:07:14.863936 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.863914 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 14:07:14.864044 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.863916 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2qbx4\"" Apr 16 14:07:14.869702 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.869682 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rh422"] Apr 16 14:07:14.908715 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.908679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:14.908872 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:14.908735 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcz2\" (UniqueName: \"kubernetes.io/projected/2d0d3edf-2f2c-493d-8dee-0685a90079dd-kube-api-access-sxcz2\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:15.009914 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.009855 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px6hx\" 
(UniqueName: \"kubernetes.io/projected/ab5ed564-2053-4ba2-b8dc-db1097e89bb3-kube-api-access-px6hx\") pod \"llmisvc-controller-manager-68cc5db7c4-rh422\" (UID: \"ab5ed564-2053-4ba2-b8dc-db1097e89bb3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.009914 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.009923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcz2\" (UniqueName: \"kubernetes.io/projected/2d0d3edf-2f2c-493d-8dee-0685a90079dd-kube-api-access-sxcz2\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:15.010140 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.009979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab5ed564-2053-4ba2-b8dc-db1097e89bb3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rh422\" (UID: \"ab5ed564-2053-4ba2-b8dc-db1097e89bb3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.010140 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.010016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:15.010140 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:07:15.010132 2569 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 14:07:15.010230 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:07:15.010196 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert podName:2d0d3edf-2f2c-493d-8dee-0685a90079dd 
nodeName:}" failed. No retries permitted until 2026-04-16 14:07:15.510174573 +0000 UTC m=+472.850549405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert") pod "kserve-controller-manager-75d667c7c4-dn9fg" (UID: "2d0d3edf-2f2c-493d-8dee-0685a90079dd") : secret "kserve-webhook-server-cert" not found Apr 16 14:07:15.021050 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.021018 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcz2\" (UniqueName: \"kubernetes.io/projected/2d0d3edf-2f2c-493d-8dee-0685a90079dd-kube-api-access-sxcz2\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:15.110689 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.110597 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-px6hx\" (UniqueName: \"kubernetes.io/projected/ab5ed564-2053-4ba2-b8dc-db1097e89bb3-kube-api-access-px6hx\") pod \"llmisvc-controller-manager-68cc5db7c4-rh422\" (UID: \"ab5ed564-2053-4ba2-b8dc-db1097e89bb3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.110855 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.110692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab5ed564-2053-4ba2-b8dc-db1097e89bb3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rh422\" (UID: \"ab5ed564-2053-4ba2-b8dc-db1097e89bb3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.113139 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.113116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab5ed564-2053-4ba2-b8dc-db1097e89bb3-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-rh422\" 
(UID: \"ab5ed564-2053-4ba2-b8dc-db1097e89bb3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.118788 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.118755 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-px6hx\" (UniqueName: \"kubernetes.io/projected/ab5ed564-2053-4ba2-b8dc-db1097e89bb3-kube-api-access-px6hx\") pod \"llmisvc-controller-manager-68cc5db7c4-rh422\" (UID: \"ab5ed564-2053-4ba2-b8dc-db1097e89bb3\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.175896 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.175864 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:15.299797 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.299768 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-rh422"] Apr 16 14:07:15.302120 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:07:15.302089 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podab5ed564_2053_4ba2_b8dc_db1097e89bb3.slice/crio-2df2c04a584867262a7aec22bc831d981c15c663a07909eb477ecb6c7c01e79e WatchSource:0}: Error finding container 2df2c04a584867262a7aec22bc831d981c15c663a07909eb477ecb6c7c01e79e: Status 404 returned error can't find the container with id 2df2c04a584867262a7aec22bc831d981c15c663a07909eb477ecb6c7c01e79e Apr 16 14:07:15.513996 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.513960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:15.516425 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.516372 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert\") pod \"kserve-controller-manager-75d667c7c4-dn9fg\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:15.715284 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.715245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" event={"ID":"ab5ed564-2053-4ba2-b8dc-db1097e89bb3","Type":"ContainerStarted","Data":"2df2c04a584867262a7aec22bc831d981c15c663a07909eb477ecb6c7c01e79e"} Apr 16 14:07:15.765725 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:15.765648 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:16.189862 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:16.189825 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dn9fg"] Apr 16 14:07:16.250688 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:07:16.250649 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0d3edf_2f2c_493d_8dee_0685a90079dd.slice/crio-08e971cfad49376fa286bbfd7e6103012dee11464437eec439a8fb54d513394f WatchSource:0}: Error finding container 08e971cfad49376fa286bbfd7e6103012dee11464437eec439a8fb54d513394f: Status 404 returned error can't find the container with id 08e971cfad49376fa286bbfd7e6103012dee11464437eec439a8fb54d513394f Apr 16 14:07:16.720113 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:16.720064 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" event={"ID":"2d0d3edf-2f2c-493d-8dee-0685a90079dd","Type":"ContainerStarted","Data":"08e971cfad49376fa286bbfd7e6103012dee11464437eec439a8fb54d513394f"} Apr 16 14:07:17.725193 ip-10-0-142-16 kubenswrapper[2569]: 
I0416 14:07:17.725148 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" event={"ID":"ab5ed564-2053-4ba2-b8dc-db1097e89bb3","Type":"ContainerStarted","Data":"0f8acf8b70f01e432c39237b7fb94e65c65daab56887e991d85089894fb82e03"} Apr 16 14:07:17.725674 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:17.725288 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:17.744304 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:17.744124 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" podStartSLOduration=1.829714 podStartE2EDuration="3.744105443s" podCreationTimestamp="2026-04-16 14:07:14 +0000 UTC" firstStartedPulling="2026-04-16 14:07:15.303387964 +0000 UTC m=+472.643762795" lastFinishedPulling="2026-04-16 14:07:17.217779393 +0000 UTC m=+474.558154238" observedRunningTime="2026-04-16 14:07:17.742894934 +0000 UTC m=+475.083269789" watchObservedRunningTime="2026-04-16 14:07:17.744105443 +0000 UTC m=+475.084480296" Apr 16 14:07:19.734618 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:19.734582 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" event={"ID":"2d0d3edf-2f2c-493d-8dee-0685a90079dd","Type":"ContainerStarted","Data":"b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba"} Apr 16 14:07:19.734990 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:19.734714 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:48.732354 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:48.732318 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-rh422" Apr 16 14:07:48.750009 ip-10-0-142-16 kubenswrapper[2569]: I0416 
14:07:48.749955 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" podStartSLOduration=31.736101124 podStartE2EDuration="34.74994102s" podCreationTimestamp="2026-04-16 14:07:14 +0000 UTC" firstStartedPulling="2026-04-16 14:07:16.252370448 +0000 UTC m=+473.592745278" lastFinishedPulling="2026-04-16 14:07:19.26621034 +0000 UTC m=+476.606585174" observedRunningTime="2026-04-16 14:07:19.769018646 +0000 UTC m=+477.109393517" watchObservedRunningTime="2026-04-16 14:07:48.74994102 +0000 UTC m=+506.090315851" Apr 16 14:07:50.289852 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.289816 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dn9fg"] Apr 16 14:07:50.290263 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.290063 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" podUID="2d0d3edf-2f2c-493d-8dee-0685a90079dd" containerName="manager" containerID="cri-o://b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba" gracePeriod=10 Apr 16 14:07:50.295055 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.295031 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:50.318429 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.318390 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-6sft9"] Apr 16 14:07:50.321928 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.321911 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.331439 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.331412 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-6sft9"] Apr 16 14:07:50.404545 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.404504 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j97dr\" (UniqueName: \"kubernetes.io/projected/d78e403d-3f82-4398-b47d-64a302d5dbe6-kube-api-access-j97dr\") pod \"kserve-controller-manager-75d667c7c4-6sft9\" (UID: \"d78e403d-3f82-4398-b47d-64a302d5dbe6\") " pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.404711 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.404604 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78e403d-3f82-4398-b47d-64a302d5dbe6-cert\") pod \"kserve-controller-manager-75d667c7c4-6sft9\" (UID: \"d78e403d-3f82-4398-b47d-64a302d5dbe6\") " pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.506187 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.506153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j97dr\" (UniqueName: \"kubernetes.io/projected/d78e403d-3f82-4398-b47d-64a302d5dbe6-kube-api-access-j97dr\") pod \"kserve-controller-manager-75d667c7c4-6sft9\" (UID: \"d78e403d-3f82-4398-b47d-64a302d5dbe6\") " pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.506366 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.506205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78e403d-3f82-4398-b47d-64a302d5dbe6-cert\") pod \"kserve-controller-manager-75d667c7c4-6sft9\" (UID: \"d78e403d-3f82-4398-b47d-64a302d5dbe6\") " 
pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.508805 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.508698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78e403d-3f82-4398-b47d-64a302d5dbe6-cert\") pod \"kserve-controller-manager-75d667c7c4-6sft9\" (UID: \"d78e403d-3f82-4398-b47d-64a302d5dbe6\") " pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.515117 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.515090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j97dr\" (UniqueName: \"kubernetes.io/projected/d78e403d-3f82-4398-b47d-64a302d5dbe6-kube-api-access-j97dr\") pod \"kserve-controller-manager-75d667c7c4-6sft9\" (UID: \"d78e403d-3f82-4398-b47d-64a302d5dbe6\") " pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.527563 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.527541 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:50.607057 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.606969 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxcz2\" (UniqueName: \"kubernetes.io/projected/2d0d3edf-2f2c-493d-8dee-0685a90079dd-kube-api-access-sxcz2\") pod \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " Apr 16 14:07:50.607057 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.607019 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert\") pod \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\" (UID: \"2d0d3edf-2f2c-493d-8dee-0685a90079dd\") " Apr 16 14:07:50.609260 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.609230 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert" (OuterVolumeSpecName: "cert") pod "2d0d3edf-2f2c-493d-8dee-0685a90079dd" (UID: "2d0d3edf-2f2c-493d-8dee-0685a90079dd"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:07:50.609260 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.609235 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d0d3edf-2f2c-493d-8dee-0685a90079dd-kube-api-access-sxcz2" (OuterVolumeSpecName: "kube-api-access-sxcz2") pod "2d0d3edf-2f2c-493d-8dee-0685a90079dd" (UID: "2d0d3edf-2f2c-493d-8dee-0685a90079dd"). InnerVolumeSpecName "kube-api-access-sxcz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:07:50.670637 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.670580 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:50.707883 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.707849 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sxcz2\" (UniqueName: \"kubernetes.io/projected/2d0d3edf-2f2c-493d-8dee-0685a90079dd-kube-api-access-sxcz2\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Apr 16 14:07:50.707883 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.707880 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d0d3edf-2f2c-493d-8dee-0685a90079dd-cert\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Apr 16 14:07:50.790845 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.790812 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-6sft9"] Apr 16 14:07:50.794108 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:07:50.794083 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78e403d_3f82_4398_b47d_64a302d5dbe6.slice/crio-a2b3d7f9f26187edaa547544aeab4ddd6dcd65d18ba1b2ee3a140be4661f7954 WatchSource:0}: Error finding container a2b3d7f9f26187edaa547544aeab4ddd6dcd65d18ba1b2ee3a140be4661f7954: Status 404 returned error can't find the container with id a2b3d7f9f26187edaa547544aeab4ddd6dcd65d18ba1b2ee3a140be4661f7954 Apr 16 14:07:50.832593 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.832560 2569 generic.go:358] "Generic (PLEG): container finished" podID="2d0d3edf-2f2c-493d-8dee-0685a90079dd" containerID="b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba" exitCode=0 Apr 16 14:07:50.832771 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.832626 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" Apr 16 14:07:50.832771 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.832643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" event={"ID":"2d0d3edf-2f2c-493d-8dee-0685a90079dd","Type":"ContainerDied","Data":"b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba"} Apr 16 14:07:50.832771 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.832705 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-dn9fg" event={"ID":"2d0d3edf-2f2c-493d-8dee-0685a90079dd","Type":"ContainerDied","Data":"08e971cfad49376fa286bbfd7e6103012dee11464437eec439a8fb54d513394f"} Apr 16 14:07:50.832771 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.832730 2569 scope.go:117] "RemoveContainer" containerID="b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba" Apr 16 14:07:50.833807 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.833762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" event={"ID":"d78e403d-3f82-4398-b47d-64a302d5dbe6","Type":"ContainerStarted","Data":"a2b3d7f9f26187edaa547544aeab4ddd6dcd65d18ba1b2ee3a140be4661f7954"} Apr 16 14:07:50.840655 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.840640 2569 scope.go:117] "RemoveContainer" containerID="b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba" Apr 16 14:07:50.840895 ip-10-0-142-16 kubenswrapper[2569]: E0416 14:07:50.840879 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba\": container with ID starting with b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba not found: ID does not exist" containerID="b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba" Apr 16 
14:07:50.840960 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.840904 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba"} err="failed to get container status \"b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba\": rpc error: code = NotFound desc = could not find container \"b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba\": container with ID starting with b5743eb48aca2fb2df98c7b542d3b3559495dca14ecacfbf783873df4c3bfeba not found: ID does not exist" Apr 16 14:07:50.852890 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.852862 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dn9fg"] Apr 16 14:07:50.856274 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:50.856250 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-dn9fg"] Apr 16 14:07:51.313801 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:51.313765 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d0d3edf-2f2c-493d-8dee-0685a90079dd" path="/var/lib/kubelet/pods/2d0d3edf-2f2c-493d-8dee-0685a90079dd/volumes" Apr 16 14:07:52.842450 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:52.842388 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" event={"ID":"d78e403d-3f82-4398-b47d-64a302d5dbe6","Type":"ContainerStarted","Data":"e6897111e24c880d218e52bd217b29c36fc834efbda6075446a2d8cb0fa9bce8"} Apr 16 14:07:52.842869 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:52.842488 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" Apr 16 14:07:52.859063 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:07:52.859013 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-75d667c7c4-6sft9" podStartSLOduration=1.653968088 podStartE2EDuration="2.858997572s" podCreationTimestamp="2026-04-16 14:07:50 +0000 UTC" firstStartedPulling="2026-04-16 14:07:50.795294911 +0000 UTC m=+508.135669742" lastFinishedPulling="2026-04-16 14:07:52.000324367 +0000 UTC m=+509.340699226" observedRunningTime="2026-04-16 14:07:52.858133016 +0000 UTC m=+510.198507869" watchObservedRunningTime="2026-04-16 14:07:52.858997572 +0000 UTC m=+510.199372486" Apr 16 14:08:09.459457 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.459420 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5699487cfc-4sntl"] Apr 16 14:08:09.459854 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.459831 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d0d3edf-2f2c-493d-8dee-0685a90079dd" containerName="manager" Apr 16 14:08:09.459897 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.459858 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0d3edf-2f2c-493d-8dee-0685a90079dd" containerName="manager" Apr 16 14:08:09.459959 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.459942 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d0d3edf-2f2c-493d-8dee-0685a90079dd" containerName="manager" Apr 16 14:08:09.463504 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.463480 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.466147 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.466122 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:08:09.466365 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.466130 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vvc8w\""
Apr 16 14:08:09.466499 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.466146 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:08:09.466499 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.466158 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:08:09.466602 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.466203 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:08:09.466816 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.466792 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:08:09.470671 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.470649 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:08:09.474754 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.474727 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5699487cfc-4sntl"]
Apr 16 14:08:09.569064 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569024 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-service-ca\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.569237 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-serving-cert\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.569237 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-trusted-ca-bundle\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.569237 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569205 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-oauth-config\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.569237 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569227 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-config\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.569384 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-oauth-serving-cert\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.569384 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.569265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-kube-api-access-th8hr\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670511 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-serving-cert\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670511 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670511 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-trusted-ca-bundle\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670746 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-oauth-config\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") "
pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670746 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-config\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670746 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-oauth-serving-cert\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670746 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-kube-api-access-th8hr\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.670746 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.670711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-service-ca\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.671464 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.671443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-config\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.671563 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.671514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-service-ca\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.671563 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.671548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-oauth-serving-cert\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.671689 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.671674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-trusted-ca-bundle\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.673065 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.673045 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-serving-cert\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.673137 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.673092 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-console-oauth-config\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.678372 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.678350 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th8hr\" (UniqueName: \"kubernetes.io/projected/7b48f4ca-269a-4977-8fec-3b31a1fd22b0-kube-api-access-th8hr\") pod \"console-5699487cfc-4sntl\" (UID: \"7b48f4ca-269a-4977-8fec-3b31a1fd22b0\") " pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.776096 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.776010 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:09.898093 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:09.898064 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5699487cfc-4sntl"]
Apr 16 14:08:09.900508 ip-10-0-142-16 kubenswrapper[2569]: W0416 14:08:09.900478 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b48f4ca_269a_4977_8fec_3b31a1fd22b0.slice/crio-5eca2fe6d1d42765d4aad293e765cc2e357b164091056bfcf96680e10d34f5f2 WatchSource:0}: Error finding container 5eca2fe6d1d42765d4aad293e765cc2e357b164091056bfcf96680e10d34f5f2: Status 404 returned error can't find the container with id 5eca2fe6d1d42765d4aad293e765cc2e357b164091056bfcf96680e10d34f5f2
Apr 16 14:08:10.899300 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:10.899264 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5699487cfc-4sntl" event={"ID":"7b48f4ca-269a-4977-8fec-3b31a1fd22b0","Type":"ContainerStarted","Data":"93699e246350ed49faf74427aec2218375c289cf879fc0b33d5d7b21ae3ff6a5"}
Apr 16 14:08:10.899300 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:10.899304 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-console/console-5699487cfc-4sntl" event={"ID":"7b48f4ca-269a-4977-8fec-3b31a1fd22b0","Type":"ContainerStarted","Data":"5eca2fe6d1d42765d4aad293e765cc2e357b164091056bfcf96680e10d34f5f2"}
Apr 16 14:08:10.916571 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:10.916514 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5699487cfc-4sntl" podStartSLOduration=1.916494327 podStartE2EDuration="1.916494327s" podCreationTimestamp="2026-04-16 14:08:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:08:10.915286708 +0000 UTC m=+528.255661561" watchObservedRunningTime="2026-04-16 14:08:10.916494327 +0000 UTC m=+528.256869178"
Apr 16 14:08:19.776712 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:19.776614 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:19.776712 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:19.776665 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:19.781384 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:19.781359 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:19.932062 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:19.932022 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5699487cfc-4sntl"
Apr 16 14:08:23.851077 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:08:23.851046 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-6sft9"
Apr 16 14:09:23.219212 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:09:23.219176 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:09:23.221109 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:09:23.221084 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:09:23.224413 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:09:23.224375 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:09:23.225791 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:09:23.225770 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:14:23.239861 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:14:23.239784 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:14:23.248182 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:14:23.248153 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:14:23.251740 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:14:23.251719 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:14:23.254284 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:14:23.254260 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:19:23.268419 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:19:23.268373 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:19:23.271449 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:19:23.271424 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:19:23.273784 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:19:23.273763 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:19:23.277064 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:19:23.277048 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:24:23.288339 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:24:23.288302 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:24:23.293192 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:24:23.293171 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:24:23.293496 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:24:23.293479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:24:23.298296 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:24:23.298275 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr
16 14:29:23.308359 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:29:23.308325 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:29:23.314503 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:29:23.314476 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:29:23.314673 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:29:23.314656 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:29:23.319381 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:29:23.319360 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:34:23.329573 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:34:23.329544 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:34:23.335099 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:34:23.335074 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:34:23.335694 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:34:23.335676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:34:23.340904 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:34:23.340887 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:39:23.349315 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:39:23.349288 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:39:23.354453 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:39:23.354430 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:39:23.356502 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:39:23.356484 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:39:23.360941 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:39:23.360924 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:44:23.369077 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:44:23.369050 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:44:23.374540 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:44:23.374515 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:44:23.377561 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:44:23.377542 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:44:23.382563 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:44:23.382543 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:49:23.389075 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:49:23.389048 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:49:23.393855 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:49:23.393833 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:49:23.403555 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:49:23.403535 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:49:23.410912 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:49:23.410890 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:54:23.408734 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:54:23.408640 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:54:23.413487 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:54:23.413470 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:54:23.431620 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:54:23.431599 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr
16 14:54:23.436590 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:54:23.436575 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:59:23.427668 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:59:23.427643 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:59:23.432603 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:59:23.432578 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 14:59:23.452470 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:59:23.452448 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 14:59:23.457314 ip-10-0-142-16 kubenswrapper[2569]: I0416 14:59:23.457297 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 15:04:17.674113 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:17.674081 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-l9j7q_d4be6a59-e6b2-4033-95df-e0f99a6fe1e3/global-pull-secret-syncer/0.log"
Apr 16 15:04:17.775021 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:17.774990 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-f5zrv_2b74162e-fbf4-4ede-b8ad-5623c1094615/konnectivity-agent/0.log"
Apr 16 15:04:17.920996 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:17.920967 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-16.ec2.internal_104ec126e58c79948296ecdd10d4aa5b/haproxy/0.log"
Apr 16 15:04:21.799346 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:21.799315 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-ht2bj_a78f3464-a81c-413a-a11a-ba6020b56874/cluster-monitoring-operator/0.log"
Apr 16 15:04:21.824838 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:21.824808 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qbddt_29aa9554-40e1-4efd-b0bc-34ab8445a858/kube-state-metrics/0.log"
Apr 16 15:04:21.847495 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:21.847465 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qbddt_29aa9554-40e1-4efd-b0bc-34ab8445a858/kube-rbac-proxy-main/0.log"
Apr 16 15:04:21.873934 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:21.873910 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qbddt_29aa9554-40e1-4efd-b0bc-34ab8445a858/kube-rbac-proxy-self/0.log"
Apr 16 15:04:22.048039 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:22.048015 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5866k_0aabbbc1-d84a-4c1a-ae38-92ce1b31b836/node-exporter/0.log"
Apr 16 15:04:22.075131 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:22.075062 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5866k_0aabbbc1-d84a-4c1a-ae38-92ce1b31b836/kube-rbac-proxy/0.log"
Apr 16 15:04:22.100199 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:22.100176 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5866k_0aabbbc1-d84a-4c1a-ae38-92ce1b31b836/init-textfile/0.log"
Apr 16 15:04:23.447226 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:23.447120 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 15:04:23.453086 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:23.452163 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 15:04:23.473050 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:23.473028 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 15:04:23.478196 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:23.478178 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log"
Apr 16 15:04:23.845269 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:23.845191 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-c4d9z_1a771ca7-2942-4693-8a9f-243a8e6f82d5/networking-console-plugin/0.log"
Apr 16 15:04:24.282667 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:24.282634 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/2.log"
Apr 16 15:04:24.287780 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:24.287753 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kpwpf_cb232208-c05b-4821-9c83-1582341d5232/console-operator/3.log"
Apr 16 15:04:24.655995 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:24.655924 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-console_console-5699487cfc-4sntl_7b48f4ca-269a-4977-8fec-3b31a1fd22b0/console/0.log"
Apr 16 15:04:24.688035 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:24.688008 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-wqjg7_06c957ef-10ea-4050-a9a1-35994a3e35f8/download-server/0.log"
Apr 16 15:04:25.105904 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.105869 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"]
Apr 16 15:04:25.109130 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.109107 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.111122 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.111106 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g9h4j\"/\"openshift-service-ca.crt\""
Apr 16 15:04:25.111606 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.111590 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g9h4j\"/\"default-dockercfg-df925\""
Apr 16 15:04:25.111661 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.111612 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g9h4j\"/\"kube-root-ca.crt\""
Apr 16 15:04:25.117141 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.117110 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"]
Apr 16 15:04:25.290844 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.290809 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-proc\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.290844 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.290844 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-podres\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.291050 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.290872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-sys\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.291050 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.290901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-lib-modules\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.291050 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.290959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrkv\" (UniqueName: \"kubernetes.io/projected/e8450bd0-3851-4a59-b103-0fac21bd21ee-kube-api-access-knrkv\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392378 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-proc\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392378 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-podres\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392378 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-sys\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392378 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-lib-modules\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392676 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knrkv\" (UniqueName: \"kubernetes.io/projected/e8450bd0-3851-4a59-b103-0fac21bd21ee-kube-api-access-knrkv\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") "
pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392676 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392464 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-sys\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392676 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392470 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-podres\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392676 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392367 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-proc\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.392676 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.392526 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8450bd0-3851-4a59-b103-0fac21bd21ee-lib-modules\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.399931 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.399911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrkv\" (UniqueName: \"kubernetes.io/projected/e8450bd0-3851-4a59-b103-0fac21bd21ee-kube-api-access-knrkv\") pod \"perf-node-gather-daemonset-twj9r\" (UID: \"e8450bd0-3851-4a59-b103-0fac21bd21ee\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.420312 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.420291 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:25.535888 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.535861 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"]
Apr 16 15:04:25.538537 ip-10-0-142-16 kubenswrapper[2569]: W0416 15:04:25.538511 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode8450bd0_3851_4a59_b103_0fac21bd21ee.slice/crio-dec5bba35e4bfb87120d04aded2e18b657ee6bed989a0235ea19d9f7b195c80f WatchSource:0}: Error finding container dec5bba35e4bfb87120d04aded2e18b657ee6bed989a0235ea19d9f7b195c80f: Status 404 returned error can't find the container with id dec5bba35e4bfb87120d04aded2e18b657ee6bed989a0235ea19d9f7b195c80f
Apr 16 15:04:25.540090 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.540071 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:04:25.868609 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.868579 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p6gtp_59e98d1e-f9cf-4faa-bd64-a597149d3bc7/dns/0.log"
Apr 16 15:04:25.889052 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.889027 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p6gtp_59e98d1e-f9cf-4faa-bd64-a597149d3bc7/kube-rbac-proxy/0.log"
Apr 16 15:04:25.935446 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:25.935415 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-p4h9n_ca33a748-eaae-40ea-9131-81e3f97ea69d/dns-node-resolver/0.log"
Apr 16 15:04:26.365141 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.365047 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r" event={"ID":"e8450bd0-3851-4a59-b103-0fac21bd21ee","Type":"ContainerStarted","Data":"22beab804829408eb90aa7a2f311d5276bc2768421b56f809f26835cec596ee5"}
Apr 16 15:04:26.365141 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.365083 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r" event={"ID":"e8450bd0-3851-4a59-b103-0fac21bd21ee","Type":"ContainerStarted","Data":"dec5bba35e4bfb87120d04aded2e18b657ee6bed989a0235ea19d9f7b195c80f"}
Apr 16 15:04:26.365366 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.365191 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r"
Apr 16 15:04:26.373962 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.373935 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d7569c8cd-4qw74_870e7376-e0fa-40ca-ad2c-98fa6189639f/registry/0.log"
Apr 16 15:04:26.381231 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.381171 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r" podStartSLOduration=1.381157797 podStartE2EDuration="1.381157797s" podCreationTimestamp="2026-04-16 15:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:04:26.378938861 +0000 UTC m=+3903.719313755" watchObservedRunningTime="2026-04-16 15:04:26.381157797 +0000 UTC m=+3903.721532650"
Apr 16 15:04:26.392134 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.392111 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-image-registry_image-registry-6d7569c8cd-4qw74_870e7376-e0fa-40ca-ad2c-98fa6189639f/registry/1.log" Apr 16 15:04:26.457471 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:26.457449 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xv4n6_d6588176-b995-4b14-80e6-c2ba40893912/node-ca/0.log" Apr 16 15:04:27.256076 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:27.256047 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5c4787ff58-x4l8s_e5a40466-a66f-4e5a-b8ea-43dd46b22ac1/router/0.log" Apr 16 15:04:27.627612 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:27.627544 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b4d9q_852d4d38-4926-4c6a-a9ad-11a60019138a/serve-healthcheck-canary/0.log" Apr 16 15:04:28.286153 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:28.286121 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5shw_9033afb5-4784-4b50-813c-d22961325cf4/kube-rbac-proxy/0.log" Apr 16 15:04:28.350327 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:28.350300 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5shw_9033afb5-4784-4b50-813c-d22961325cf4/exporter/0.log" Apr 16 15:04:28.408365 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:28.408336 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5shw_9033afb5-4784-4b50-813c-d22961325cf4/extractor/0.log" Apr 16 15:04:30.419343 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:30.419315 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-75d667c7c4-6sft9_d78e403d-3f82-4398-b47d-64a302d5dbe6/manager/0.log" Apr 16 15:04:30.440660 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:30.440608 2569 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-rh422_ab5ed564-2053-4ba2-b8dc-db1097e89bb3/manager/0.log" Apr 16 15:04:32.380786 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:32.380757 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-twj9r" Apr 16 15:04:36.844445 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:36.844416 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/kube-multus-additional-cni-plugins/0.log" Apr 16 15:04:36.868770 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:36.868694 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/egress-router-binary-copy/0.log" Apr 16 15:04:36.890568 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:36.890546 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/cni-plugins/0.log" Apr 16 15:04:36.912041 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:36.912023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/bond-cni-plugin/0.log" Apr 16 15:04:36.933731 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:36.933710 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/routeoverride-cni/0.log" Apr 16 15:04:36.956996 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:36.956980 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/whereabouts-cni-bincopy/0.log" Apr 16 15:04:36.978356 ip-10-0-142-16 kubenswrapper[2569]: I0416 
15:04:36.978340 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wlchs_44914f75-f504-40b2-932d-a36d8319394c/whereabouts-cni/0.log" Apr 16 15:04:37.036354 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:37.036327 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t2gj8_5f104cac-2458-4f3f-b7d2-b71aef2dff52/kube-multus/0.log" Apr 16 15:04:37.119291 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:37.119218 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lfj5m_44d7f301-04c1-422a-a689-9d0e4f02952c/network-metrics-daemon/0.log" Apr 16 15:04:37.140277 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:37.140253 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lfj5m_44d7f301-04c1-422a-a689-9d0e4f02952c/kube-rbac-proxy/0.log" Apr 16 15:04:38.531130 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.531006 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-controller/0.log" Apr 16 15:04:38.552091 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.552067 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/0.log" Apr 16 15:04:38.568244 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.568222 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovn-acl-logging/1.log" Apr 16 15:04:38.589689 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.589672 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/kube-rbac-proxy-node/0.log" Apr 16 15:04:38.611462 ip-10-0-142-16 kubenswrapper[2569]: I0416 
15:04:38.611436 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:04:38.632757 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.632739 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/northd/0.log" Apr 16 15:04:38.656116 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.656097 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/nbdb/0.log" Apr 16 15:04:38.680190 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.680173 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/sbdb/0.log" Apr 16 15:04:38.774633 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:38.774603 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-prt2z_e6def905-3f86-432f-b6ba-a5f4649cc324/ovnkube-controller/0.log" Apr 16 15:04:39.864170 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:39.864142 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-7t4bf_f1a2e25c-5259-48d8-865a-7328810adf10/check-endpoints/0.log" Apr 16 15:04:39.889038 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:39.889012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-f6dlw_9698ff93-a877-4a74-b2ff-29e433108995/network-check-target-container/0.log" Apr 16 15:04:40.878085 ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:40.878060 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lpgrv_ba1a0175-ff05-496f-a2ab-9b87059cf3c3/iptables-alerter/0.log" Apr 16 15:04:41.606831 
ip-10-0-142-16 kubenswrapper[2569]: I0416 15:04:41.606804 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ql6p9_bdcb6f5a-276e-476f-ace0-4bc3f243da52/tuned/0.log"