Apr 21 06:23:42.550402 ip-10-0-138-76 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 06:23:42.550418 ip-10-0-138-76 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 06:23:42.550428 ip-10-0-138-76 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 06:23:42.550775 ip-10-0-138-76 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 06:23:52.594232 ip-10-0-138-76 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 06:23:52.594250 ip-10-0-138-76 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2d93ae3ae3e6414aba10b66175ed96da --
Apr 21 06:26:25.391869 ip-10-0-138-76 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 06:26:25.857859 ip-10-0-138-76 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:25.857859 ip-10-0-138-76 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 06:26:25.857859 ip-10-0-138-76 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:25.857859 ip-10-0-138-76 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 06:26:25.857859 ip-10-0-138-76 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 06:26:25.859415 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.859342 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 06:26:25.863196 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863182 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:25.863196 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863197 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863201 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863205 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863208 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863211 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863214 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863217 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863220 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863223 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863227 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863230 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863233 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863236 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863239 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863242 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863245 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863248 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863251 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863258 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863261 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:25.863254 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863264 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863267 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863270 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863273 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863276 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863279 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863281 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863284 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863286 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863289 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863291 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863294 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863296 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863299 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863301 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863304 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863306 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863309 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863311 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863314 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:25.863749 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863316 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863319 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863321 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863323 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863326 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863329 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863332 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863334 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863337 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863339 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863342 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863345 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863347 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863350 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863353 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863356 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863359 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863362 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863364 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:25.864249 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863370 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863373 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863376 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863379 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863382 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863385 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863387 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863391 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863394 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863396 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863399 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863402 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863405 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863407 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863410 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863412 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863415 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863418 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863421 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863424 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:25.864756 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863426 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863429 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863432 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863437 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863440 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863443 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863810 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863816 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863819 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863821 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863824 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863826 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863829 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863832 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863834 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863837 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863839 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863842 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863845 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:25.865232 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863847 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863866 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863870 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863873 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863876 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863879 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863882 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863885 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863888 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863891 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863894 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863897 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863900 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863903 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863906 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863908 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863911 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863914 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863917 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863919 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:25.865693 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863922 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863924 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863927 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863930 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863932 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863935 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863937 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863939 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863942 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863945 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863947 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863949 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863952 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863955 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863958 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863960 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863963 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863965 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863970 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:25.866219 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863973 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863976 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863979 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863982 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863984 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863988 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863991 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863993 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863996 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.863998 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864001 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864003 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864006 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864009 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864011 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864014 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864017 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864021 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864023 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:25.866745 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864026 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864028 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864031 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864033 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864036 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864038 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864040 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864043 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864046 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864048 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864051 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864054 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864056 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864059 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864062 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864134 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864145 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864150 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864155 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864159 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864162 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 06:26:25.867197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864166 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864171 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864174 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864177 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864180 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864184 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864188 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864191 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864194 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864196 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864199 2571 flags.go:64] FLAG: --cloud-config=""
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864202 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864205 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864209 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864212 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864215 2571 flags.go:64] FLAG: --config-dir=""
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864218 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864221 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864225 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864228 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864231 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864234 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864238 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864241 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 06:26:25.867720 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864244 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864247 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864251 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864256 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864259 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864262 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864265 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864268 2571 flags.go:64] FLAG: --enable-server="true"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864271 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864275 2571 flags.go:64] FLAG: --event-burst="100"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864278 2571 flags.go:64] FLAG: --event-qps="50"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864281 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864284 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864287 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864291 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864294 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864296 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864300 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864303 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864306 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864308 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864311 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864314 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21
06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864317 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864320 2571 flags.go:64] FLAG: --feature-gates="" Apr 21 06:26:25.868288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864324 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864326 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864329 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864332 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864336 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864339 2571 flags.go:64] FLAG: --help="false" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864342 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-138-76.ec2.internal" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864345 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864348 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864353 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864356 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864359 2571 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864362 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864365 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864367 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864370 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864373 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864376 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864379 2571 flags.go:64] FLAG: --kube-reserved="" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864382 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864385 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864388 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864391 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864394 2571 flags.go:64] FLAG: --lock-file="" Apr 21 06:26:25.868941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864396 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864399 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:26:25.864402 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864407 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864410 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864415 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864418 2571 flags.go:64] FLAG: --logging-format="text" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864421 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864424 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864427 2571 flags.go:64] FLAG: --manifest-url="" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864430 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864434 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864437 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864441 2571 flags.go:64] FLAG: --max-pods="110" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864444 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864446 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864449 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 
06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864453 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864456 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864459 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864462 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864469 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864472 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864475 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 06:26:25.869505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864478 2571 flags.go:64] FLAG: --pod-cidr="" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864481 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864486 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864488 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864492 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864495 2571 flags.go:64] FLAG: --port="10250" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864497 2571 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864501 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a67797474473cc44" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864504 2571 flags.go:64] FLAG: --qos-reserved="" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864507 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864510 2571 flags.go:64] FLAG: --register-node="true" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864527 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864530 2571 flags.go:64] FLAG: --register-with-taints="" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864536 2571 flags.go:64] FLAG: --registry-burst="10" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864538 2571 flags.go:64] FLAG: --registry-qps="5" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864541 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864544 2571 flags.go:64] FLAG: --reserved-memory="" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864548 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864551 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864554 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864556 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864559 2571 flags.go:64] FLAG: 
--runonce="false" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864562 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864565 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864568 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 21 06:26:25.870095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864571 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864576 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864579 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864582 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864585 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864588 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864591 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864594 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864596 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864599 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864603 2571 flags.go:64] FLAG: --system-cgroups="" Apr 21 06:26:25.870703 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:26:25.864605 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864623 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864627 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864630 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864636 2571 flags.go:64] FLAG: --tls-min-version="" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864639 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864642 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864645 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864648 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864652 2571 flags.go:64] FLAG: --v="2" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864656 2571 flags.go:64] FLAG: --version="false" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864660 2571 flags.go:64] FLAG: --vmodule="" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864664 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.864667 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 06:26:25.870703 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864748 2571 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864751 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864755 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864758 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864761 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864764 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864767 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864770 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864774 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864777 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864779 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864782 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864784 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864787 2571 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864789 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864792 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864795 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864798 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864800 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864803 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 06:26:25.871320 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864805 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864809 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864813 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864816 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864818 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864821 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864823 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864827 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864830 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864832 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864835 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864837 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864840 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864842 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864845 2571 feature_gate.go:328] unrecognized 
feature gate: AdminNetworkPolicy Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864847 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864850 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864852 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864855 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 06:26:25.872214 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864858 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864862 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864865 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864867 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864870 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864873 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864875 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864878 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864880 
2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864883 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864885 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864888 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864890 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864893 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864895 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864898 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864900 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864903 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864905 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 06:26:25.872842 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864908 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864912 2571 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864915 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864917 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864920 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864922 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864925 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864928 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864930 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864933 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864935 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864938 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864940 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864943 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864947 2571 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864950 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864952 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864955 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864958 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864961 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 06:26:25.873328 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864963 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864967 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864970 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864973 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864976 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864978 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864981 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.864983 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:25.873829 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.865842 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 06:26:25.874087 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.874066 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 06:26:25.874120 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.874090 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 06:26:25.874152 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874139 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:25.874152 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874146 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:25.874152 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874151 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874156 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874161 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874164 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874168 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874171 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874174 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874176 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874179 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874182 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874185 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874188 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874191 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874193 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874196 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874199 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874201 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874204 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874207 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874209 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:25.874234 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874213 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874216 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874219 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874221 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874224 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874234 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874239 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874243 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874248 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874252 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874255 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874258 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874260 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874263 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874266 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874268 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874271 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874273 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874276 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874278 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:25.874735 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874281 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874283 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874286 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874289 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874292 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874294 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874297 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874299 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874301 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874304 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874308 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874311 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874315 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874320 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874325 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874329 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874333 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874336 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874339 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:25.875222 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874341 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874344 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874347 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874349 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874352 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874354 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874357 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874359 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874362 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874364 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874367 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874369 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874372 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874374 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874377 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874380 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874383 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874386 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874388 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:25.875684 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874392 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874399 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874403 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874408 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874413 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874417 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.874422 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874538 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874544 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874548 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874551 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874554 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874557 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874561 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874565 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 06:26:25.876170 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874570 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874576 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874579 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874581 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874584 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874586 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874589 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874591 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874594 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874597 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874599 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874602 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874604 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874607 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874610 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874613 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874615 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874618 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874620 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 06:26:25.876554 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874623 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874625 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874628 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874630 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874633 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874635 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874639 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874644 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874648 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874652 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874657 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874660 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874663 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874665 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874668 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874671 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874673 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874676 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874678 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874681 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 06:26:25.877014 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874684 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874686 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874689 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874691 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874694 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874697 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874699 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874702 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874705 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874708 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874711 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874713 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874716 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874720 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874724 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874729 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874733 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874738 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874743 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 06:26:25.877504 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874746 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874749 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874752 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874755 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874758 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874761 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874764 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874767 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874770 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874773 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874775 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874778 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874781 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874784 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874787 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874789 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874792 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874795 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874797 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 06:26:25.877976 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:25.874800 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 06:26:25.878450 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.874805 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 06:26:25.878450 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.875795 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 06:26:25.878450 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.877833 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 06:26:25.878862 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.878851 2571 server.go:1019] "Starting client certificate rotation"
Apr 21 06:26:25.878962 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.878946 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 06:26:25.878997 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.878988 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 06:26:25.907808 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.907792 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 06:26:25.910680 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.910665 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 06:26:25.927916 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.927897 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 21 06:26:25.937662 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.937636 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 06:26:25.937861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.937847 2571 log.go:25] "Validated CRI v1 image API"
Apr 21 06:26:25.939032 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.939014 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 06:26:25.941349 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.941328 2571 fs.go:135] Filesystem UUIDs: map[79fc0091-9abc-4358-8029-78a620abdb1b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c0552977-a77c-447c-8cb0-19f07335f970:/dev/nvme0n1p4]
Apr 21 06:26:25.941431 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.941348 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 06:26:25.946734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.946627 2571 manager.go:217] Machine: {Timestamp:2026-04-21 06:26:25.944955318 +0000 UTC m=+0.429606332 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103271 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a2e1372a3fb686d6f7dbede2c519d SystemUUID:ec2a2e13-72a3-fb68-6d6f-7dbede2c519d BootID:2d93ae3a-e3e6-414a-ba10-b66175ed96da Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:77:1e:a9:d9:43 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:77:1e:a9:d9:43 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:3c:90:97:6f:27 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 06:26:25.946734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.946721 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 06:26:25.946915 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.946852 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 06:26:25.949310 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.949283 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 06:26:25.949467 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.949312 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-76.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 06:26:25.949573 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.949477 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 06:26:25.949573 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.949489 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 06:26:25.949573 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.949506 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 06:26:25.950579 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.950566 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 06:26:25.952151 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.952138 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 06:26:25.952408 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.952396 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 06:26:25.954305 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.954289 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xhhz"
Apr 21 06:26:25.955093 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.955081 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 06:26:25.955146 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.955100 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 06:26:25.955146 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.955116 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 06:26:25.955146 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.955129 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 21 06:26:25.955274 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.955151 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 06:26:25.956608 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.956595 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 06:26:25.956671 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.956617 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 06:26:25.959862 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.959845 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 06:26:25.961225 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.961211 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 06:26:25.962277 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.962259 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xhhz"
Apr 21 06:26:25.963256 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963241 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 06:26:25.963318 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963262 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 06:26:25.963318 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963287 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 06:26:25.963318 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963299 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 06:26:25.963318 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963309 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 06:26:25.963318 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963315 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 06:26:25.963318 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963321 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 06:26:25.963482 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963327 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 06:26:25.963482 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963334 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 06:26:25.963482 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963340 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 06:26:25.963482 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963355 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 06:26:25.963482 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.963363 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 06:26:25.964322 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.964310 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 06:26:25.964322 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.964321 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 06:26:25.967716 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.967702 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 06:26:25.967795 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.967739 2571 server.go:1295] "Started kubelet"
Apr 21 06:26:25.967875 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.967830 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 06:26:25.968556 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.967912 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 06:26:25.968630 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.968579 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 06:26:25.968558 ip-10-0-138-76 systemd[1]: Started Kubernetes Kubelet.
Apr 21 06:26:25.970187 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.970162 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 06:26:25.974701 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.974674 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:25.974989 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.974973 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 06:26:25.977871 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.977853 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-76.ec2.internal" not found
Apr 21 06:26:25.978926 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.978908 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 06:26:25.979024 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.979004 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 06:26:25.979613 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.979599 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 06:26:25.979743 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.979725 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 06:26:25.981121 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:25.981098 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-76.ec2.internal\" not found"
Apr 21 06:26:25.981805 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.981733 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:25.981896 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.981886 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 06:26:25.982076 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982059 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 06:26:25.982076 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982076 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 06:26:25.982404 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:25.982381 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 06:26:25.982509 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982496 2571 factory.go:55] Registering systemd factory
Apr 21 06:26:25.982593 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982535 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 21 06:26:25.982767 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982756 2571 factory.go:153] Registering CRI-O factory
Apr 21 06:26:25.982767 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982768 2571 factory.go:223] Registration of the crio container factory successfully
Apr 21 06:26:25.982857 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982846 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 06:26:25.982894 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982868 2571 factory.go:103] Registering Raw factory
Apr 21 06:26:25.982894 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.982884 2571 manager.go:1196] Started watching for new ooms in manager
Apr 21 06:26:25.983532 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.983497 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:25.983784 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.983766 2571 manager.go:319] Starting recovery of all containers
Apr 21 06:26:25.986040 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:25.986016 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-76.ec2.internal\" not found" node="ip-10-0-138-76.ec2.internal"
Apr 21 06:26:25.992958 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.992937 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-76.ec2.internal" not found
Apr 21 06:26:25.993460 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.993443 2571 manager.go:324] Recovery completed
Apr 21 06:26:25.997649 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.997637 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:25.999322 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.999305 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:25.999395 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.999332 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:25.999395 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.999342 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:25.999834 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.999817 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 06:26:25.999834 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.999832 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 06:26:25.999921 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:25.999848 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 06:26:26.002595 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.002578 2571 policy_none.go:49] "None policy: Start"
Apr 21 06:26:26.002672 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.002598 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 06:26:26.002672 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.002611 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 06:26:26.037076 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037062 2571 manager.go:341] "Starting Device Plugin manager"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:26.037132 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037145 2571 server.go:85] "Starting device plugin registration server"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037393 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037410 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037588 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037660 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.037669 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:26.038010 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 06:26:26.049547 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:26.038035 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-76.ec2.internal\" not found"
Apr 21 06:26:26.052563 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.052550 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-76.ec2.internal" not found
Apr 21 06:26:26.109064 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.109007 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 06:26:26.110165 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.110148 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 06:26:26.110231 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.110173 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 06:26:26.110231 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.110188 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 06:26:26.110231 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.110195 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 06:26:26.110231 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:26.110225 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 06:26:26.112503 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.112479 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 06:26:26.138399 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.138381 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 06:26:26.139221 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.139195 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 06:26:26.139294 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.139231 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 06:26:26.139294 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.139246 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeHasSufficientPID"
Apr 21 06:26:26.139294 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.139289 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.148715 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.148697 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.210828 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.210779 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"]
Apr 21 06:26:26.213052 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.213039 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.213136 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.213040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.228999 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.228984 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.233210 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.233198 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.248973 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.248955 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 06:26:26.251588 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.251576 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 06:26:26.284624 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.284607 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/443cc1db22b53511937a5739cfa2cb24-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal\" (UID: \"443cc1db22b53511937a5739cfa2cb24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.284696 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.284632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/443cc1db22b53511937a5739cfa2cb24-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal\" (UID: \"443cc1db22b53511937a5739cfa2cb24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.284696 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.284649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/131d5a43792cacf9a9c03a2052451cbd-config\") pod \"kube-apiserver-proxy-ip-10-0-138-76.ec2.internal\" (UID: \"131d5a43792cacf9a9c03a2052451cbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.385808 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.385758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/443cc1db22b53511937a5739cfa2cb24-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal\" (UID: \"443cc1db22b53511937a5739cfa2cb24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.385808 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.385796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/443cc1db22b53511937a5739cfa2cb24-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal\" (UID: \"443cc1db22b53511937a5739cfa2cb24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.385914 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.385839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/443cc1db22b53511937a5739cfa2cb24-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal\" (UID: \"443cc1db22b53511937a5739cfa2cb24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.385914 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.385857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/131d5a43792cacf9a9c03a2052451cbd-config\") pod \"kube-apiserver-proxy-ip-10-0-138-76.ec2.internal\" (UID: \"131d5a43792cacf9a9c03a2052451cbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.385914 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.385890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/131d5a43792cacf9a9c03a2052451cbd-config\") pod \"kube-apiserver-proxy-ip-10-0-138-76.ec2.internal\" (UID: \"131d5a43792cacf9a9c03a2052451cbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.386004 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.385923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/443cc1db22b53511937a5739cfa2cb24-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal\" (UID: \"443cc1db22b53511937a5739cfa2cb24\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.551278 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.551258 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.553774 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.553749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal"
Apr 21 06:26:26.878302 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.878228 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 06:26:26.879073 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.878371 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 06:26:26.879073 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.878371 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 06:26:26.879073 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.878402 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 06:26:26.956163 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.956137 2571 apiserver.go:52] "Watching apiserver"
Apr 21 06:26:26.964073 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.964040 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 06:21:25 +0000 UTC" deadline="2027-12-06 09:02:21.936005745 +0000 UTC"
Apr 21 06:26:26.964073 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.964071 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14258h35m54.971937152s"
Apr 21 06:26:26.964186 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.964138 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 06:26:26.965821 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.965802 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-kf6hb","kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal","openshift-cluster-node-tuning-operator/tuned-p6x5s","openshift-dns/node-resolver-9x6w4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal","openshift-multus/multus-additional-cni-plugins-stw9q","openshift-network-diagnostics/network-check-target-zk874","openshift-network-operator/iptables-alerter-59vzs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb","openshift-image-registry/node-ca-qvz4w","openshift-multus/multus-vpw6q","openshift-multus/network-metrics-daemon-qjchj","openshift-ovn-kubernetes/ovnkube-node-z8rxc"]
Apr 21 06:26:26.968077 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.968055 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kf6hb"
Apr 21 06:26:26.969162 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.969136 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p6x5s"
Apr 21 06:26:26.969284 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.969230 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9x6w4"
Apr 21 06:26:26.970412 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.970390 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 06:26:26.970412 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.970402 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4kcr\""
Apr 21 06:26:26.970680 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.970663 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 06:26:26.970996 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.970972 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-stw9q"
Apr 21 06:26:26.971687 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.971648 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 06:26:26.971780 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.971727 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lb22k\""
Apr 21 06:26:26.971846 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.971817 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:26:26.972040 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.972018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2nkqr\""
Apr 21 06:26:26.972133 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.972019 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 06:26:26.972380 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.972360 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 06:26:26.973083 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973064 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 06:26:26.973182 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 06:26:26.973243 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973183 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 06:26:26.973298 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973258 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 06:26:26.973507 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973489 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9dlfm\""
Apr 21 06:26:26.973609 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973572 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874"
Apr 21 06:26:26.973974 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.973947 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-59vzs"
Apr 21 06:26:26.974067 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:26.973967 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b"
Apr 21 06:26:26.974679 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.974582 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 06:26:26.976380 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.976249 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 06:26:26.976380 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.976294 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb"
Apr 21 06:26:26.976380 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.976252 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6rn2f\""
Apr 21 06:26:26.976602 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.976544 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:26:26.977097 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.977082 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 06:26:26.977728 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.977714 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qvz4w"
Apr 21 06:26:26.978474 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.978458 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 06:26:26.978587 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.978463 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j8zxn\""
Apr 21 06:26:26.978634 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.978599 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 06:26:26.978924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.978909 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.979088 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.979075 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 06:26:26.979451 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.979435 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 06:26:26.980075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.980057 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 06:26:26.980210 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.980061 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 06:26:26.980210 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.980060 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 06:26:26.980210 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.980205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:26.980489 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:26.980276 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:26.980489 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.980399 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-284kh\"" Apr 21 06:26:26.981692 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.981675 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-h2gmf\"" Apr 21 06:26:26.981841 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.981708 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 06:26:26.981841 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.981749 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.982643 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.982626 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 06:26:26.983992 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.983956 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 06:26:26.984070 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.984016 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vgk92\"" Apr 21 06:26:26.985216 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.985199 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 06:26:26.985216 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.985208 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" 
Apr 21 06:26:26.985430 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.985291 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 06:26:26.985483 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.985438 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 06:26:26.985561 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.985492 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 06:26:26.990024 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990007 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 06:26:26.990127 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6mpm\" (UniqueName: \"kubernetes.io/projected/9b8417f4-abc8-485b-8bfc-78987d632957-kube-api-access-q6mpm\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990127 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990066 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8eba033d-48d3-4a60-b429-c79feb5274f3-iptables-alerter-script\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:26.990127 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysctl-d\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.990127 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-sys\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.990279 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-cnibin\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.990279 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990173 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-cni-binary-copy\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.990279 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990196 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.990279 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:26:26.990221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-registration-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.990279 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-socket-dir-parent\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990279 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-cnibin\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-netns\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990319 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-conf-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " 
pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990342 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4x7l\" (UniqueName: \"kubernetes.io/projected/a81bc131-222a-47ad-9171-0a4db0b65c51-kube-api-access-k4x7l\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-log-socket\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990436 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-ovnkube-config\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9b8417f4-abc8-485b-8bfc-78987d632957-ovn-node-metrics-cert\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-daemon-config\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e96c3e6-5bac-49c9-b707-018f191114fa-tmp\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.990582 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-system-cni-dir\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990602 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990626 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-socket-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990650 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-device-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-hostroot\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990696 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-systemd\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-system-cni-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" 
Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysctl-conf\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-var-lib-kubelet\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-tuned\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990820 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b319b9ca-8134-427f-bce9-921c4216c413-serviceca\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-run-netns\") pod \"ovnkube-node-z8rxc\" (UID: 
\"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-var-lib-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-cni-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990935 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-cni-netd\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/1acddd86-ee36-4689-b8ab-ef158e2b4a47-konnectivity-ca\") pod \"konnectivity-agent-kf6hb\" (UID: \"1acddd86-ee36-4689-b8ab-ef158e2b4a47\") " pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:26.990971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-run\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.990996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-kubernetes\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991036 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-cni-bin\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991084 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-kubelet\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991114 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r85dd\" (UniqueName: \"kubernetes.io/projected/546de538-b56a-4ad2-baeb-3d59144586fb-kube-api-access-r85dd\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-etc-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991173 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-env-overrides\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-etc-selinux\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991218 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hl6q\" (UniqueName: \"kubernetes.io/projected/b319b9ca-8134-427f-bce9-921c4216c413-kube-api-access-5hl6q\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " 
pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-cni-bin\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-os-release\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-systemd\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-etc-kubernetes\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-systemd-units\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1acddd86-ee36-4689-b8ab-ef158e2b4a47-agent-certs\") pod \"konnectivity-agent-kf6hb\" (UID: \"1acddd86-ee36-4689-b8ab-ef158e2b4a47\") " pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:26.991557 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-tmp-dir\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eba033d-48d3-4a60-b429-c79feb5274f3-host-slash\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991492 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b319b9ca-8134-427f-bce9-921c4216c413-host\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c82w2\" (UniqueName: \"kubernetes.io/projected/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-kube-api-access-c82w2\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991541 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-lib-modules\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-node-log\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991593 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-tuning-conf-dir\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 
06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-hosts-file\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-cni-multus\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991667 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991707 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfwkf\" (UniqueName: \"kubernetes.io/projected/2b985311-2ecb-45b4-8665-a9a42cef2837-kube-api-access-zfwkf\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991738 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-modprobe-d\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-kubelet\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-ovnkube-script-lib\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991820 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgrx\" (UniqueName: \"kubernetes.io/projected/6e96c3e6-5bac-49c9-b707-018f191114fa-kube-api-access-mdgrx\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-sys-fs\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.992205 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:26:26.991869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxq6c\" (UniqueName: \"kubernetes.io/projected/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-kube-api-access-xxq6c\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a81bc131-222a-47ad-9171-0a4db0b65c51-cni-binary-copy\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-multus-certs\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-slash\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.991988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-host\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.992010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-ovn\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.992056 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-k8s-cni-cncf-io\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.992088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tqf\" (UniqueName: \"kubernetes.io/projected/8eba033d-48d3-4a60-b429-c79feb5274f3-kube-api-access-g4tqf\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.992113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysconfig\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.992135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-os-release\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:26.992681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:26.992157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.014766 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.014748 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rqs6c" Apr 21 06:26:27.023015 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.022998 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rqs6c" Apr 21 06:26:27.080911 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.080870 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443cc1db22b53511937a5739cfa2cb24.slice/crio-44c2cb2ce5046c698b7e099121432b1f846914c97c593900620d566e3a8dc630 WatchSource:0}: Error finding container 44c2cb2ce5046c698b7e099121432b1f846914c97c593900620d566e3a8dc630: Status 404 returned error can't find the container with id 44c2cb2ce5046c698b7e099121432b1f846914c97c593900620d566e3a8dc630 
Apr 21 06:26:27.081134 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.081115 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131d5a43792cacf9a9c03a2052451cbd.slice/crio-e0f98274bf6448a07f99f11f9e644f9f6062899b19f43dd7a1c91b6fb2bff9b2 WatchSource:0}: Error finding container e0f98274bf6448a07f99f11f9e644f9f6062899b19f43dd7a1c91b6fb2bff9b2: Status 404 returned error can't find the container with id e0f98274bf6448a07f99f11f9e644f9f6062899b19f43dd7a1c91b6fb2bff9b2 Apr 21 06:26:27.086551 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.086510 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 06:26:27.092508 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-run\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.092616 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-kubernetes\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.092616 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-cni-bin\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.092616 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092582 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-run\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.092616 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-kubelet\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r85dd\" (UniqueName: \"kubernetes.io/projected/546de538-b56a-4ad2-baeb-3d59144586fb-kube-api-access-r85dd\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-kubelet\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-kubernetes\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092653 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-etc-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-cni-bin\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092687 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-etc-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-env-overrides\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-etc-selinux\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.092800 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092763 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hl6q\" (UniqueName: \"kubernetes.io/projected/b319b9ca-8134-427f-bce9-921c4216c413-kube-api-access-5hl6q\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-etc-selinux\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-cni-bin\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-os-release\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 
06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-systemd\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.092961 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-etc-kubernetes\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-systemd-units\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1acddd86-ee36-4689-b8ab-ef158e2b4a47-agent-certs\") pod \"konnectivity-agent-kf6hb\" (UID: \"1acddd86-ee36-4689-b8ab-ef158e2b4a47\") " pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093053 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-tmp-dir\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.093263 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:26:27.093057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-os-release\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eba033d-48d3-4a60-b429-c79feb5274f3-host-slash\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b319b9ca-8134-427f-bce9-921c4216c413-host\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-cni-bin\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c82w2\" (UniqueName: \"kubernetes.io/projected/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-kube-api-access-c82w2\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.093263 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:26:27.093159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-lib-modules\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-node-log\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093263 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-tuning-conf-dir\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-hosts-file\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-cni-multus\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.093747 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093285 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-env-overrides\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093310 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eba033d-48d3-4a60-b429-c79feb5274f3-host-slash\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfwkf\" (UniqueName: \"kubernetes.io/projected/2b985311-2ecb-45b4-8665-a9a42cef2837-kube-api-access-zfwkf\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-modprobe-d\") pod \"tuned-p6x5s\" (UID: 
\"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-systemd\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093383 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-systemd-units\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-etc-kubernetes\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-modprobe-d\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093493 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-var-lib-cni-multus\") pod \"multus-vpw6q\" (UID: 
\"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-tuning-conf-dir\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-kubelet\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093595 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-node-log\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b319b9ca-8134-427f-bce9-921c4216c413-host\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.093747 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093635 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-hosts-file\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-tmp-dir\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093668 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-kubelet\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093600 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-ovnkube-script-lib\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgrx\" (UniqueName: \"kubernetes.io/projected/6e96c3e6-5bac-49c9-b707-018f191114fa-kube-api-access-mdgrx\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-lib-modules\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-sys-fs\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxq6c\" (UniqueName: \"kubernetes.io/projected/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-kube-api-access-xxq6c\") pod 
\"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a81bc131-222a-47ad-9171-0a4db0b65c51-cni-binary-copy\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-multus-certs\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-sys-fs\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093895 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-multus-certs\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-slash\") pod 
\"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.093999 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-host\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-ovn\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-slash\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.094506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094038 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-k8s-cni-cncf-io\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tqf\" (UniqueName: \"kubernetes.io/projected/8eba033d-48d3-4a60-b429-c79feb5274f3-kube-api-access-g4tqf\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094108 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-host\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094122 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-ovn\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysconfig\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094150 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-k8s-cni-cncf-io\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094158 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-os-release\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094220 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6mpm\" (UniqueName: \"kubernetes.io/projected/9b8417f4-abc8-485b-8bfc-78987d632957-kube-api-access-q6mpm\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094232 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-ovnkube-script-lib\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-os-release\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8eba033d-48d3-4a60-b429-c79feb5274f3-iptables-alerter-script\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysconfig\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094301 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysctl-d\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.095336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-sys\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-cnibin\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-cni-binary-copy\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:26:27.094472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a81bc131-222a-47ad-9171-0a4db0b65c51-cni-binary-copy\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094490 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094534 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysctl-d\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-registration-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-socket-dir-parent\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-cnibin\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-netns\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-conf-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4x7l\" (UniqueName: \"kubernetes.io/projected/a81bc131-222a-47ad-9171-0a4db0b65c51-kube-api-access-k4x7l\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-sys\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094767 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-log-socket\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8eba033d-48d3-4a60-b429-c79feb5274f3-iptables-alerter-script\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094820 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-cnibin\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096145 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-ovnkube-config\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094847 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-registration-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b8417f4-abc8-485b-8bfc-78987d632957-ovn-node-metrics-cert\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-socket-dir-parent\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094885 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-daemon-config\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-host-run-netns\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094945 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-cni-binary-copy\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e96c3e6-5bac-49c9-b707-018f191114fa-tmp\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b985311-2ecb-45b4-8665-a9a42cef2837-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.094994 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-system-cni-dir\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.096861 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-socket-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095054 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-conf-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-system-cni-dir\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-device-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-hostroot\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 
06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.096861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-systemd\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-system-cni-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-log-socket\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095165 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b985311-2ecb-45b4-8665-a9a42cef2837-cnibin\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.097381 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysctl-conf\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-var-lib-kubelet\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-tuned\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095300 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b319b9ca-8134-427f-bce9-921c4216c413-serviceca\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.097381 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-run-netns\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-var-lib-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-cni-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-cni-netd\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 
06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1acddd86-ee36-4689-b8ab-ef158e2b4a47-konnectivity-ca\") pod \"konnectivity-agent-kf6hb\" (UID: \"1acddd86-ee36-4689-b8ab-ef158e2b4a47\") " pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-daemon-config\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095644 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-system-cni-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-device-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.097381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095757 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-hostroot\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:26:27.095799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-var-lib-openvswitch\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095840 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-run-netns\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.095906 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-host-cni-netd\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.095957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-socket-dir\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/a81bc131-222a-47ad-9171-0a4db0b65c51-multus-cni-dir\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-sysctl-conf\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e96c3e6-5bac-49c9-b707-018f191114fa-var-lib-kubelet\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9b8417f4-abc8-485b-8bfc-78987d632957-ovnkube-config\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9b8417f4-abc8-485b-8bfc-78987d632957-run-systemd\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/b319b9ca-8134-427f-bce9-921c4216c413-serviceca\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.096569 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:27.595972647 +0000 UTC m=+2.080623667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.096624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1acddd86-ee36-4689-b8ab-ef158e2b4a47-konnectivity-ca\") pod \"konnectivity-agent-kf6hb\" (UID: \"1acddd86-ee36-4689-b8ab-ef158e2b4a47\") " pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.097222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1acddd86-ee36-4689-b8ab-ef158e2b4a47-agent-certs\") pod \"konnectivity-agent-kf6hb\" (UID: \"1acddd86-ee36-4689-b8ab-ef158e2b4a47\") " pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:27.097924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.097305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6e96c3e6-5bac-49c9-b707-018f191114fa-tmp\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " 
pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.098408 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.098321 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6e96c3e6-5bac-49c9-b707-018f191114fa-etc-tuned\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.098877 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.098860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9b8417f4-abc8-485b-8bfc-78987d632957-ovn-node-metrics-cert\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.101831 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.101746 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:27.101831 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.101768 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:27.101831 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.101782 2571 projected.go:194] Error preparing data for projected volume kube-api-access-hlktq for pod openshift-network-diagnostics/network-check-target-zk874: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:27.102100 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.101846 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq 
podName:fdd3109b-7468-400c-b587-0e2d50c0911b nodeName:}" failed. No retries permitted until 2026-04-21 06:26:27.601830686 +0000 UTC m=+2.086481689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hlktq" (UniqueName: "kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq") pod "network-check-target-zk874" (UID: "fdd3109b-7468-400c-b587-0e2d50c0911b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:27.103409 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.103354 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hl6q\" (UniqueName: \"kubernetes.io/projected/b319b9ca-8134-427f-bce9-921c4216c413-kube-api-access-5hl6q\") pod \"node-ca-qvz4w\" (UID: \"b319b9ca-8134-427f-bce9-921c4216c413\") " pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.103925 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.103901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdgrx\" (UniqueName: \"kubernetes.io/projected/6e96c3e6-5bac-49c9-b707-018f191114fa-kube-api-access-mdgrx\") pod \"tuned-p6x5s\" (UID: \"6e96c3e6-5bac-49c9-b707-018f191114fa\") " pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.104039 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.104019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tqf\" (UniqueName: \"kubernetes.io/projected/8eba033d-48d3-4a60-b429-c79feb5274f3-kube-api-access-g4tqf\") pod \"iptables-alerter-59vzs\" (UID: \"8eba033d-48d3-4a60-b429-c79feb5274f3\") " pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.104222 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.104206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4x7l\" (UniqueName: 
\"kubernetes.io/projected/a81bc131-222a-47ad-9171-0a4db0b65c51-kube-api-access-k4x7l\") pod \"multus-vpw6q\" (UID: \"a81bc131-222a-47ad-9171-0a4db0b65c51\") " pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.104588 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.104570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r85dd\" (UniqueName: \"kubernetes.io/projected/546de538-b56a-4ad2-baeb-3d59144586fb-kube-api-access-r85dd\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:27.104729 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.104706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c82w2\" (UniqueName: \"kubernetes.io/projected/a90e0df9-bd86-4e43-ab44-ddd45d0f1a43-kube-api-access-c82w2\") pod \"node-resolver-9x6w4\" (UID: \"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43\") " pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.105393 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.105372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6mpm\" (UniqueName: \"kubernetes.io/projected/9b8417f4-abc8-485b-8bfc-78987d632957-kube-api-access-q6mpm\") pod \"ovnkube-node-z8rxc\" (UID: \"9b8417f4-abc8-485b-8bfc-78987d632957\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.105560 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.105503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfwkf\" (UniqueName: \"kubernetes.io/projected/2b985311-2ecb-45b4-8665-a9a42cef2837-kube-api-access-zfwkf\") pod \"multus-additional-cni-plugins-stw9q\" (UID: \"2b985311-2ecb-45b4-8665-a9a42cef2837\") " pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.105612 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.105570 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-xxq6c\" (UniqueName: \"kubernetes.io/projected/5b1c2098-f1cb-4a0f-a8c1-d131e97e930d-kube-api-access-xxq6c\") pod \"aws-ebs-csi-driver-node-mr8fb\" (UID: \"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.112622 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.112586 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal" event={"ID":"131d5a43792cacf9a9c03a2052451cbd","Type":"ContainerStarted","Data":"e0f98274bf6448a07f99f11f9e644f9f6062899b19f43dd7a1c91b6fb2bff9b2"} Apr 21 06:26:27.113550 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.113527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal" event={"ID":"443cc1db22b53511937a5739cfa2cb24","Type":"ContainerStarted","Data":"44c2cb2ce5046c698b7e099121432b1f846914c97c593900620d566e3a8dc630"} Apr 21 06:26:27.297648 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.297586 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:27.303413 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.303393 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1acddd86_ee36_4689_b8ab_ef158e2b4a47.slice/crio-e96ba5e3b88d0ef5b7d8690d3caa78da28754cb0fa2f00a3798fa70588a5a556 WatchSource:0}: Error finding container e96ba5e3b88d0ef5b7d8690d3caa78da28754cb0fa2f00a3798fa70588a5a556: Status 404 returned error can't find the container with id e96ba5e3b88d0ef5b7d8690d3caa78da28754cb0fa2f00a3798fa70588a5a556 Apr 21 06:26:27.306153 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.306134 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" Apr 21 06:26:27.312558 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.312540 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e96c3e6_5bac_49c9_b707_018f191114fa.slice/crio-c05ebdd5f7d3d3439998b7755777ceb6dbbf6989ce58260c8efa505fe75f8fd7 WatchSource:0}: Error finding container c05ebdd5f7d3d3439998b7755777ceb6dbbf6989ce58260c8efa505fe75f8fd7: Status 404 returned error can't find the container with id c05ebdd5f7d3d3439998b7755777ceb6dbbf6989ce58260c8efa505fe75f8fd7 Apr 21 06:26:27.327152 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.327133 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9x6w4" Apr 21 06:26:27.330595 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.330576 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-stw9q" Apr 21 06:26:27.332699 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.332679 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90e0df9_bd86_4e43_ab44_ddd45d0f1a43.slice/crio-cd0aa0d72df42fae8d7548f11a49c6f7f930d8be29a6350866d5839cae783a5c WatchSource:0}: Error finding container cd0aa0d72df42fae8d7548f11a49c6f7f930d8be29a6350866d5839cae783a5c: Status 404 returned error can't find the container with id cd0aa0d72df42fae8d7548f11a49c6f7f930d8be29a6350866d5839cae783a5c Apr 21 06:26:27.337436 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.337415 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b985311_2ecb_45b4_8665_a9a42cef2837.slice/crio-2225d12966275b1d9af16d75f8023a305c137eba07e503c0ac34599516dd4ece WatchSource:0}: Error finding container 
2225d12966275b1d9af16d75f8023a305c137eba07e503c0ac34599516dd4ece: Status 404 returned error can't find the container with id 2225d12966275b1d9af16d75f8023a305c137eba07e503c0ac34599516dd4ece Apr 21 06:26:27.354239 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.354222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-59vzs" Apr 21 06:26:27.358735 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.358719 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" Apr 21 06:26:27.359245 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.359228 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eba033d_48d3_4a60_b429_c79feb5274f3.slice/crio-00de33b7dfb9a9711693ebbdc18188b58435e66ce7b388d1c053b4f67447b910 WatchSource:0}: Error finding container 00de33b7dfb9a9711693ebbdc18188b58435e66ce7b388d1c053b4f67447b910: Status 404 returned error can't find the container with id 00de33b7dfb9a9711693ebbdc18188b58435e66ce7b388d1c053b4f67447b910 Apr 21 06:26:27.365063 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.365042 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b1c2098_f1cb_4a0f_a8c1_d131e97e930d.slice/crio-fc4978ba9c2e4bf949c1643236e35a694f7feb376de6fee4b56259e8cfcb15a8 WatchSource:0}: Error finding container fc4978ba9c2e4bf949c1643236e35a694f7feb376de6fee4b56259e8cfcb15a8: Status 404 returned error can't find the container with id fc4978ba9c2e4bf949c1643236e35a694f7feb376de6fee4b56259e8cfcb15a8 Apr 21 06:26:27.384143 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.384125 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qvz4w" Apr 21 06:26:27.390875 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.390827 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vpw6q" Apr 21 06:26:27.392152 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.392127 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb319b9ca_8134_427f_bce9_921c4216c413.slice/crio-636eac30ee2af29c1b3f2714adde4e0742ef2fc243c9b83f3f14b63326aec5a7 WatchSource:0}: Error finding container 636eac30ee2af29c1b3f2714adde4e0742ef2fc243c9b83f3f14b63326aec5a7: Status 404 returned error can't find the container with id 636eac30ee2af29c1b3f2714adde4e0742ef2fc243c9b83f3f14b63326aec5a7 Apr 21 06:26:27.397045 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.397024 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:27.398169 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.398148 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda81bc131_222a_47ad_9171_0a4db0b65c51.slice/crio-df881658d7fced550a1e2a250a86f18bd61878de393f9cf5fe1a7b2dad90c07b WatchSource:0}: Error finding container df881658d7fced550a1e2a250a86f18bd61878de393f9cf5fe1a7b2dad90c07b: Status 404 returned error can't find the container with id df881658d7fced550a1e2a250a86f18bd61878de393f9cf5fe1a7b2dad90c07b Apr 21 06:26:27.404430 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:26:27.404404 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8417f4_abc8_485b_8bfc_78987d632957.slice/crio-58962a338fe00904626c4507f0f11b2dad5b7a8e1f611499d7ccb3e853caef20 WatchSource:0}: Error finding container 
58962a338fe00904626c4507f0f11b2dad5b7a8e1f611499d7ccb3e853caef20: Status 404 returned error can't find the container with id 58962a338fe00904626c4507f0f11b2dad5b7a8e1f611499d7ccb3e853caef20 Apr 21 06:26:27.598588 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.598461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:27.598732 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.598616 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:27.598732 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.598674 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:28.598655018 +0000 UTC m=+3.083306024 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:27.699178 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.699146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:27.699389 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.699322 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:27.699389 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.699341 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:27.699389 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.699354 2571 projected.go:194] Error preparing data for projected volume kube-api-access-hlktq for pod openshift-network-diagnostics/network-check-target-zk874: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:27.699577 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:27.699409 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq podName:fdd3109b-7468-400c-b587-0e2d50c0911b nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:28.69939026 +0000 UTC m=+3.184041269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hlktq" (UniqueName: "kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq") pod "network-check-target-zk874" (UID: "fdd3109b-7468-400c-b587-0e2d50c0911b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:27.917357 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:27.917094 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:28.024193 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.024103 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:21:27 +0000 UTC" deadline="2027-11-07 08:49:43.625867882 +0000 UTC" Apr 21 06:26:28.024193 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.024136 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13562h23m15.601734709s" Apr 21 06:26:28.133331 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.133285 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-59vzs" event={"ID":"8eba033d-48d3-4a60-b429-c79feb5274f3","Type":"ContainerStarted","Data":"00de33b7dfb9a9711693ebbdc18188b58435e66ce7b388d1c053b4f67447b910"} Apr 21 06:26:28.136620 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.136588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerStarted","Data":"2225d12966275b1d9af16d75f8023a305c137eba07e503c0ac34599516dd4ece"} Apr 21 06:26:28.140205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.140166 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9x6w4" event={"ID":"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43","Type":"ContainerStarted","Data":"cd0aa0d72df42fae8d7548f11a49c6f7f930d8be29a6350866d5839cae783a5c"} Apr 21 06:26:28.148058 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.148035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qvz4w" event={"ID":"b319b9ca-8134-427f-bce9-921c4216c413","Type":"ContainerStarted","Data":"636eac30ee2af29c1b3f2714adde4e0742ef2fc243c9b83f3f14b63326aec5a7"} Apr 21 06:26:28.163293 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.163266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" event={"ID":"6e96c3e6-5bac-49c9-b707-018f191114fa","Type":"ContainerStarted","Data":"c05ebdd5f7d3d3439998b7755777ceb6dbbf6989ce58260c8efa505fe75f8fd7"} Apr 21 06:26:28.166312 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.166288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kf6hb" event={"ID":"1acddd86-ee36-4689-b8ab-ef158e2b4a47","Type":"ContainerStarted","Data":"e96ba5e3b88d0ef5b7d8690d3caa78da28754cb0fa2f00a3798fa70588a5a556"} Apr 21 06:26:28.172493 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.172441 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"58962a338fe00904626c4507f0f11b2dad5b7a8e1f611499d7ccb3e853caef20"} Apr 21 06:26:28.188209 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.186615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpw6q" event={"ID":"a81bc131-222a-47ad-9171-0a4db0b65c51","Type":"ContainerStarted","Data":"df881658d7fced550a1e2a250a86f18bd61878de393f9cf5fe1a7b2dad90c07b"} Apr 21 06:26:28.194000 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.193978 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" event={"ID":"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d","Type":"ContainerStarted","Data":"fc4978ba9c2e4bf949c1643236e35a694f7feb376de6fee4b56259e8cfcb15a8"} Apr 21 06:26:28.326765 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.326735 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:28.367054 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.367025 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 06:26:28.607285 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.607206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:28.607432 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:28.607363 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:28.607432 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:28.607421 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:30.607402801 +0000 UTC m=+5.092053804 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:28.707843 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:28.707807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:28.708034 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:28.708012 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:28.708101 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:28.708036 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:28.708101 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:28.708049 2571 projected.go:194] Error preparing data for projected volume kube-api-access-hlktq for pod openshift-network-diagnostics/network-check-target-zk874: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:28.708210 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:28.708106 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq podName:fdd3109b-7468-400c-b587-0e2d50c0911b nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:30.708087442 +0000 UTC m=+5.192738443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hlktq" (UniqueName: "kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq") pod "network-check-target-zk874" (UID: "fdd3109b-7468-400c-b587-0e2d50c0911b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:29.025137 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:29.025049 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:21:27 +0000 UTC" deadline="2027-12-17 07:16:57.496082395 +0000 UTC" Apr 21 06:26:29.025137 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:29.025088 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14520h50m28.470997802s" Apr 21 06:26:29.111146 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:29.111095 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:29.111325 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:29.111236 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:29.111719 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:29.111698 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:29.111867 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:29.111808 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:30.625048 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:30.625005 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:30.625489 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:30.625153 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:30.625489 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:30.625253 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:34.625232046 +0000 UTC m=+9.109883051 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:30.725910 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:30.725672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:30.725910 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:30.725809 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:30.725910 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:30.725829 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:30.725910 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:30.725841 2571 projected.go:194] Error preparing data for projected volume kube-api-access-hlktq for pod openshift-network-diagnostics/network-check-target-zk874: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:30.725910 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:30.725894 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq podName:fdd3109b-7468-400c-b587-0e2d50c0911b nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:34.725875516 +0000 UTC m=+9.210526520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hlktq" (UniqueName: "kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq") pod "network-check-target-zk874" (UID: "fdd3109b-7468-400c-b587-0e2d50c0911b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:31.111777 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:31.111217 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:31.111777 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:31.111351 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:31.111777 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:31.111378 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:31.111777 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:31.111450 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:33.111288 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:33.111224 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:33.111978 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:33.111354 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:33.111978 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:33.111560 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:33.111978 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:33.111664 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:34.655234 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:34.654974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:34.655234 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:34.655145 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:34.655234 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:34.655213 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:42.655192136 +0000 UTC m=+17.139843139 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:34.755455 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:34.755416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:34.755651 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:34.755627 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:34.755651 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:34.755649 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:34.755776 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:34.755663 2571 projected.go:194] Error preparing data for projected volume kube-api-access-hlktq for pod openshift-network-diagnostics/network-check-target-zk874: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:34.755776 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:34.755719 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq podName:fdd3109b-7468-400c-b587-0e2d50c0911b nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:42.755699766 +0000 UTC m=+17.240350784 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hlktq" (UniqueName: "kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq") pod "network-check-target-zk874" (UID: "fdd3109b-7468-400c-b587-0e2d50c0911b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:35.111589 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:35.111051 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:35.111589 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:35.111115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:35.111589 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:35.111234 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:35.111589 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:35.111423 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:36.292761 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.292688 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rlzb5"] Apr 21 06:26:36.294662 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.294639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.294778 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:36.294720 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:36.367888 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.367824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75cb84e7-4602-4954-9579-ec59fa9a8289-kubelet-config\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.367888 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.367893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75cb84e7-4602-4954-9579-ec59fa9a8289-dbus\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.368126 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.367924 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.468446 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.468416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75cb84e7-4602-4954-9579-ec59fa9a8289-kubelet-config\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.468707 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.468470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75cb84e7-4602-4954-9579-ec59fa9a8289-dbus\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.468707 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.468496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.468707 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.468584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/75cb84e7-4602-4954-9579-ec59fa9a8289-kubelet-config\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.468707 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:36.468606 2571 
secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:36.468707 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:36.468673 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret podName:75cb84e7-4602-4954-9579-ec59fa9a8289 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:36.968653001 +0000 UTC m=+11.453304012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret") pod "global-pull-secret-syncer-rlzb5" (UID: "75cb84e7-4602-4954-9579-ec59fa9a8289") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:36.468707 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.468686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/75cb84e7-4602-4954-9579-ec59fa9a8289-dbus\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.972470 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:36.972443 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:36.972629 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:36.972603 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:36.972737 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:36.972672 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret podName:75cb84e7-4602-4954-9579-ec59fa9a8289 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:37.972657288 +0000 UTC m=+12.457308288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret") pod "global-pull-secret-syncer-rlzb5" (UID: "75cb84e7-4602-4954-9579-ec59fa9a8289") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:37.111204 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:37.111170 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:37.111204 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:37.111211 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:37.111400 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:37.111302 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:37.111472 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:37.111442 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:37.982305 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:37.982266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:37.982784 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:37.982414 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:37.982784 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:37.982486 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret podName:75cb84e7-4602-4954-9579-ec59fa9a8289 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:39.982466328 +0000 UTC m=+14.467117332 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret") pod "global-pull-secret-syncer-rlzb5" (UID: "75cb84e7-4602-4954-9579-ec59fa9a8289") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:38.110654 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:38.110622 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:38.110808 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:38.110747 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:39.110600 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:39.110564 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:39.111009 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:39.110563 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:39.111009 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:39.110684 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:39.111009 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:39.110786 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:39.999062 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:39.999024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:39.999241 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:39.999187 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:39.999297 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:39.999252 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret podName:75cb84e7-4602-4954-9579-ec59fa9a8289 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:43.999236988 +0000 UTC m=+18.483887989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret") pod "global-pull-secret-syncer-rlzb5" (UID: "75cb84e7-4602-4954-9579-ec59fa9a8289") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:40.113628 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:40.113598 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:40.114015 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:40.113717 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:41.110902 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:41.110866 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:41.111048 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:41.110870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:41.111048 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:41.110972 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:41.111141 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:41.111058 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:42.114741 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:42.114710 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:42.115203 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.114826 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:42.720309 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:42.720273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:42.720605 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.720437 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:42.720605 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.720527 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:26:58.720493146 +0000 UTC m=+33.205144163 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 06:26:42.821316 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:42.821280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:42.821488 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.821424 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 06:26:42.821488 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.821445 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 06:26:42.821488 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.821457 2571 projected.go:194] Error preparing data for projected volume kube-api-access-hlktq for pod openshift-network-diagnostics/network-check-target-zk874: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:42.821673 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:42.821532 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq podName:fdd3109b-7468-400c-b587-0e2d50c0911b nodeName:}" failed. 
No retries permitted until 2026-04-21 06:26:58.821498958 +0000 UTC m=+33.306149974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hlktq" (UniqueName: "kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq") pod "network-check-target-zk874" (UID: "fdd3109b-7468-400c-b587-0e2d50c0911b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 06:26:43.111065 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:43.110873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:43.111257 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:43.111081 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:43.111257 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:43.111116 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:43.111257 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:43.111220 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:44.032193 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:44.032156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:44.032650 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:44.032326 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:44.032650 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:44.032402 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret podName:75cb84e7-4602-4954-9579-ec59fa9a8289 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:52.032380111 +0000 UTC m=+26.517031112 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret") pod "global-pull-secret-syncer-rlzb5" (UID: "75cb84e7-4602-4954-9579-ec59fa9a8289") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:44.111083 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:44.111046 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:44.111250 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:44.111167 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:45.110413 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.110391 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:45.110814 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.110391 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:45.110814 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:45.110504 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:45.110814 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:45.110595 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:45.223247 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.223196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal" event={"ID":"131d5a43792cacf9a9c03a2052451cbd","Type":"ContainerStarted","Data":"f5a8ba5efce966d89eab18512247b867bd5024c9add02fd148fe99299316d1fc"} Apr 21 06:26:45.227437 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.227412 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"647aaf5bb4bb85b8d843f609bdce5ba5c4d52421048f771d65a3334c4378fbd2"} Apr 21 06:26:45.227562 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.227446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"0dbd75c229c4d7a25082279873b23aa9131d1137eadf341102af0e4d7e7039c2"} Apr 21 06:26:45.227562 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.227463 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"75f64e54c9338ec23e4c338c9f8e65f85150b4fd6134630aa662004b11d2cd15"} Apr 21 06:26:45.229009 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.228984 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vpw6q" event={"ID":"a81bc131-222a-47ad-9171-0a4db0b65c51","Type":"ContainerStarted","Data":"7192e7e136ffe08d140253f2490ed9dcbdbfafcd6a723bacb7621d9ba88394a5"} Apr 21 06:26:45.233062 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.232887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" 
event={"ID":"6e96c3e6-5bac-49c9-b707-018f191114fa","Type":"ContainerStarted","Data":"ffa77ae13c6610b2f5c9f37f867c309965d48b1eb82ef0ad5c8d9413b6724d1e"} Apr 21 06:26:45.237111 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.236503 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-76.ec2.internal" podStartSLOduration=19.236492337 podStartE2EDuration="19.236492337s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:26:45.236341942 +0000 UTC m=+19.720992964" watchObservedRunningTime="2026-04-21 06:26:45.236492337 +0000 UTC m=+19.721143369" Apr 21 06:26:45.251470 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.251416 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-p6x5s" podStartSLOduration=1.6714972879999999 podStartE2EDuration="19.25140031s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.313949692 +0000 UTC m=+1.798600692" lastFinishedPulling="2026-04-21 06:26:44.893852707 +0000 UTC m=+19.378503714" observedRunningTime="2026-04-21 06:26:45.250807311 +0000 UTC m=+19.735458332" watchObservedRunningTime="2026-04-21 06:26:45.25140031 +0000 UTC m=+19.736051333" Apr 21 06:26:45.267045 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:45.267004 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vpw6q" podStartSLOduration=1.7678492270000001 podStartE2EDuration="19.266990839s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.400776581 +0000 UTC m=+1.885427584" lastFinishedPulling="2026-04-21 06:26:44.899918195 +0000 UTC m=+19.384569196" observedRunningTime="2026-04-21 06:26:45.266711872 +0000 UTC m=+19.751362893" 
watchObservedRunningTime="2026-04-21 06:26:45.266990839 +0000 UTC m=+19.751641862" Apr 21 06:26:46.112005 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.111843 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:46.112710 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:46.112080 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:46.235823 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.235796 2571 generic.go:358] "Generic (PLEG): container finished" podID="443cc1db22b53511937a5739cfa2cb24" containerID="5eecd425adce4391f74bf526826d5b3ec7e32bd92cc9fe920f47d0a610f261d1" exitCode=0 Apr 21 06:26:46.235929 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.235864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal" event={"ID":"443cc1db22b53511937a5739cfa2cb24","Type":"ContainerDied","Data":"5eecd425adce4391f74bf526826d5b3ec7e32bd92cc9fe920f47d0a610f261d1"} Apr 21 06:26:46.238273 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.238255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log" Apr 21 06:26:46.238565 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.238545 2571 generic.go:358] "Generic (PLEG): container finished" podID="9b8417f4-abc8-485b-8bfc-78987d632957" containerID="0dbd75c229c4d7a25082279873b23aa9131d1137eadf341102af0e4d7e7039c2" exitCode=1 Apr 21 06:26:46.238674 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:26:46.238571 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerDied","Data":"0dbd75c229c4d7a25082279873b23aa9131d1137eadf341102af0e4d7e7039c2"} Apr 21 06:26:46.238674 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.238598 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"efd0980df937014f8b83cb18e03ba26b3703fbf769d4dedfbf7a8678a24e976c"} Apr 21 06:26:46.238674 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.238612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"2669f222e4627b750f1ef34c479af0b45d41116e9e05c0dd39836dc661958aab"} Apr 21 06:26:46.238674 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.238625 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"6688d4390e8f526e51c32f2d210ef424aa682877e68668b8c6a9cc6b380b1d82"} Apr 21 06:26:46.242096 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.242074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" event={"ID":"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d","Type":"ContainerStarted","Data":"ce402b12f2619bdfa884ab4e5aed79d83f9dd7fad4d5a3803a200d23ede9e265"} Apr 21 06:26:46.243664 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.243642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-59vzs" event={"ID":"8eba033d-48d3-4a60-b429-c79feb5274f3","Type":"ContainerStarted","Data":"e89e2f57f11733bcb2dee8f30c63637076d053bcf7da8ea9c74cb3807f3adc7e"} Apr 21 
06:26:46.244802 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.244781 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b985311-2ecb-45b4-8665-a9a42cef2837" containerID="7123a47449e6448b81bd134fe4f25285e5a8e9e5f2c7dd195e36527ab53e9e49" exitCode=0 Apr 21 06:26:46.244875 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.244823 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerDied","Data":"7123a47449e6448b81bd134fe4f25285e5a8e9e5f2c7dd195e36527ab53e9e49"} Apr 21 06:26:46.246157 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.246072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9x6w4" event={"ID":"a90e0df9-bd86-4e43-ab44-ddd45d0f1a43","Type":"ContainerStarted","Data":"e9c0b676a961b05d0bff88fcf270c4209740f9f12c625a792f81c08800aa6d87"} Apr 21 06:26:46.247382 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.247350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qvz4w" event={"ID":"b319b9ca-8134-427f-bce9-921c4216c413","Type":"ContainerStarted","Data":"e461de5cedeaa95e6995defef4bf9291af5297fcebb043db981ac4992d366721"} Apr 21 06:26:46.248690 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.248668 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-kf6hb" event={"ID":"1acddd86-ee36-4689-b8ab-ef158e2b4a47","Type":"ContainerStarted","Data":"b41c3958f74fde26ebe44493175effd3a2cd5835b6aa193cef3ca2440fe49f54"} Apr 21 06:26:46.281592 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.281546 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qvz4w" podStartSLOduration=2.8139074859999997 podStartE2EDuration="20.281531532s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.394383841 +0000 UTC 
m=+1.879034854" lastFinishedPulling="2026-04-21 06:26:44.862007886 +0000 UTC m=+19.346658900" observedRunningTime="2026-04-21 06:26:46.280919685 +0000 UTC m=+20.765570722" watchObservedRunningTime="2026-04-21 06:26:46.281531532 +0000 UTC m=+20.766182549" Apr 21 06:26:46.306935 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.306899 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-kf6hb" podStartSLOduration=2.749720593 podStartE2EDuration="20.306888088s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.304839106 +0000 UTC m=+1.789490106" lastFinishedPulling="2026-04-21 06:26:44.862006587 +0000 UTC m=+19.346657601" observedRunningTime="2026-04-21 06:26:46.293905086 +0000 UTC m=+20.778556109" watchObservedRunningTime="2026-04-21 06:26:46.306888088 +0000 UTC m=+20.791539164" Apr 21 06:26:46.307069 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.307048 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9x6w4" podStartSLOduration=2.747855946 podStartE2EDuration="20.307043424s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.334810344 +0000 UTC m=+1.819461350" lastFinishedPulling="2026-04-21 06:26:44.893997828 +0000 UTC m=+19.378648828" observedRunningTime="2026-04-21 06:26:46.306563508 +0000 UTC m=+20.791214529" watchObservedRunningTime="2026-04-21 06:26:46.307043424 +0000 UTC m=+20.791694446" Apr 21 06:26:46.319410 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.319374 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-59vzs" podStartSLOduration=2.780881258 podStartE2EDuration="20.319364469s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.361825604 +0000 UTC m=+1.846476608" lastFinishedPulling="2026-04-21 06:26:44.900308818 +0000 UTC 
m=+19.384959819" observedRunningTime="2026-04-21 06:26:46.319348468 +0000 UTC m=+20.803999491" watchObservedRunningTime="2026-04-21 06:26:46.319364469 +0000 UTC m=+20.804015492" Apr 21 06:26:46.573882 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:46.573860 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 06:26:47.048481 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.048366 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T06:26:46.573877946Z","UUID":"05f9059e-87da-42e9-94b8-d7703b3bb4cc","Handler":null,"Name":"","Endpoint":""} Apr 21 06:26:47.051689 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.051647 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 06:26:47.051689 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.051682 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 06:26:47.110644 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.110615 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:47.110807 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.110621 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:47.110807 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:47.110736 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:47.110916 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:47.110829 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:47.252836 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.252797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" event={"ID":"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d","Type":"ContainerStarted","Data":"f22e0788def063df6f945453054cbee238d6337cdf350345de2acfbade495b30"} Apr 21 06:26:47.255061 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.255035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal" event={"ID":"443cc1db22b53511937a5739cfa2cb24","Type":"ContainerStarted","Data":"f9cc1b584d58c854d09051d65a76a471b0f559dcd27458f480e1739b3fc16f97"} Apr 21 06:26:47.275017 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:47.274969 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-76.ec2.internal" podStartSLOduration=21.274954216 podStartE2EDuration="21.274954216s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:26:47.274121774 +0000 UTC m=+21.758772797" watchObservedRunningTime="2026-04-21 06:26:47.274954216 +0000 UTC m=+21.759605244" Apr 21 06:26:48.110820 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:48.110573 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:48.110983 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:48.110923 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:48.259893 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:48.259865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log" Apr 21 06:26:48.260413 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:48.260240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"ec8f3f16a0050911a09c45a64b72d4cc9a875f867bcb642ff233c8219772a875"} Apr 21 06:26:48.262110 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:48.262084 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" event={"ID":"5b1c2098-f1cb-4a0f-a8c1-d131e97e930d","Type":"ContainerStarted","Data":"14b440ca3c42079a56c7c690135b54af57658ffec0561bfb6595359c148bad77"} Apr 21 06:26:48.277579 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:48.277533 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mr8fb" podStartSLOduration=2.205790193 podStartE2EDuration="22.277500775s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.36661726 +0000 UTC m=+1.851268263" lastFinishedPulling="2026-04-21 06:26:47.438327842 +0000 UTC m=+21.922978845" observedRunningTime="2026-04-21 06:26:48.277223046 +0000 UTC m=+22.761874071" watchObservedRunningTime="2026-04-21 06:26:48.277500775 +0000 UTC m=+22.762151798" Apr 21 06:26:49.110647 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:49.110617 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:49.110817 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:49.110618 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:49.110817 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:49.110720 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:49.110817 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:49.110795 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:49.619749 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:49.619718 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:49.620502 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:49.620483 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:50.110468 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:50.110437 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:50.110641 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:50.110568 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:50.268663 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:50.268493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log" Apr 21 06:26:50.269045 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:50.269020 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"d9183f86a6faefb3164260a486e51c5a2119c0ef05979b4ca0dafadbbdac3451"} Apr 21 06:26:50.269330 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:50.269302 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:50.269530 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:50.269497 2571 scope.go:117] "RemoveContainer" containerID="0dbd75c229c4d7a25082279873b23aa9131d1137eadf341102af0e4d7e7039c2" Apr 21 06:26:50.269879 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:50.269860 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-kf6hb" Apr 21 06:26:51.111151 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.111124 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:51.111606 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.111163 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:51.111606 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:51.111228 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:51.111606 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:51.111286 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:51.273408 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.273383 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log" Apr 21 06:26:51.273730 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.273700 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" event={"ID":"9b8417f4-abc8-485b-8bfc-78987d632957","Type":"ContainerStarted","Data":"dc5162b5b5bfe5ad45578a04ff3b0360257e65be10c6a74c8a2e399fb40395d5"} Apr 21 06:26:51.273951 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.273927 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:51.274039 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.273954 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:51.274039 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.273967 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:51.275488 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.275460 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b985311-2ecb-45b4-8665-a9a42cef2837" containerID="e12a6dff70710006d405a7ebf03e8a0d511296d55c38416132d64159f95854a5" exitCode=0 Apr 21 06:26:51.275616 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.275546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerDied","Data":"e12a6dff70710006d405a7ebf03e8a0d511296d55c38416132d64159f95854a5"} Apr 21 06:26:51.288624 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:26:51.288604 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:51.288706 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.288666 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" Apr 21 06:26:51.299959 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:51.299922 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc" podStartSLOduration=7.781667276 podStartE2EDuration="25.299909905s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.406466044 +0000 UTC m=+1.891117048" lastFinishedPulling="2026-04-21 06:26:44.924708676 +0000 UTC m=+19.409359677" observedRunningTime="2026-04-21 06:26:51.299412207 +0000 UTC m=+25.784063219" watchObservedRunningTime="2026-04-21 06:26:51.299909905 +0000 UTC m=+25.784560927" Apr 21 06:26:52.097578 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.097540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:52.097763 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:52.097645 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:52.097763 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:52.097693 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret podName:75cb84e7-4602-4954-9579-ec59fa9a8289 nodeName:}" failed. 
No retries permitted until 2026-04-21 06:27:08.097679929 +0000 UTC m=+42.582330929 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret") pod "global-pull-secret-syncer-rlzb5" (UID: "75cb84e7-4602-4954-9579-ec59fa9a8289") : object "kube-system"/"original-pull-secret" not registered Apr 21 06:26:52.111020 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.110993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:52.111131 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:52.111103 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:52.123271 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.123246 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rlzb5"] Apr 21 06:26:52.126463 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.126436 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zk874"] Apr 21 06:26:52.126599 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.126558 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:52.126670 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:52.126636 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:52.127109 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.127089 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qjchj"] Apr 21 06:26:52.127194 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.127183 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:52.127277 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:52.127260 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:52.277425 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:52.277403 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:52.277567 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:52.277506 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:53.280894 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:53.280859 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b985311-2ecb-45b4-8665-a9a42cef2837" containerID="7a0e0bfd0916013d2561b9f6b0c5b3c568d74bdd3b42f14894c2fba2b8a92e7c" exitCode=0 Apr 21 06:26:53.281499 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:53.280967 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerDied","Data":"7a0e0bfd0916013d2561b9f6b0c5b3c568d74bdd3b42f14894c2fba2b8a92e7c"} Apr 21 06:26:54.111363 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:54.111326 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:54.111363 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:54.111355 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:54.111609 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:54.111481 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:54.111609 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:54.111562 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:54.111712 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:54.111551 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:54.111712 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:54.111628 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:55.286539 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:55.286489 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b985311-2ecb-45b4-8665-a9a42cef2837" containerID="a4c74cd45b01fcc2878d80b0c1dcc013e12feda9fc889c2f1eb4dc1759712a2e" exitCode=0 Apr 21 06:26:55.287140 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:55.286547 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerDied","Data":"a4c74cd45b01fcc2878d80b0c1dcc013e12feda9fc889c2f1eb4dc1759712a2e"} Apr 21 06:26:56.112177 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:56.112150 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:56.112324 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:56.112262 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qjchj" podUID="546de538-b56a-4ad2-baeb-3d59144586fb" Apr 21 06:26:56.112401 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:56.112319 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:56.112472 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:56.112448 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzb5" podUID="75cb84e7-4602-4954-9579-ec59fa9a8289" Apr 21 06:26:56.112549 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:56.112502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:56.112612 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:56.112591 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zk874" podUID="fdd3109b-7468-400c-b587-0e2d50c0911b" Apr 21 06:26:57.813549 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.813507 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-76.ec2.internal" event="NodeReady" Apr 21 06:26:57.813935 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.813645 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 06:26:57.854846 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.854820 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t2hf9"] Apr 21 06:26:57.879741 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.879700 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mdnlh"] Apr 21 06:26:57.879892 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.879751 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:57.882499 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.882445 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 06:26:57.882499 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.882451 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 06:26:57.882702 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.882545 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz466\"" Apr 21 06:26:57.897829 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.897808 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t2hf9"] Apr 21 06:26:57.897829 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.897833 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mdnlh"] Apr 21 06:26:57.898011 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.897935 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:57.900639 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.900398 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 06:26:57.900639 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.900454 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 06:26:57.900639 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.900565 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 06:26:57.900831 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:57.900764 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-98jt7\"" Apr 21 06:26:58.045271 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.045238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6d75f91-6fd1-4d46-9c53-7c8492b33064-config-volume\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.045416 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.045278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.045416 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.045301 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m48z\" (UniqueName: 
\"kubernetes.io/projected/f6d75f91-6fd1-4d46-9c53-7c8492b33064-kube-api-access-8m48z\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.045416 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.045324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:58.045416 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.045404 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glbhf\" (UniqueName: \"kubernetes.io/projected/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-kube-api-access-glbhf\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:58.045606 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.045444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6d75f91-6fd1-4d46-9c53-7c8492b33064-tmp-dir\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.113630 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.113605 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5" Apr 21 06:26:58.113796 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.113638 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:58.113796 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.113754 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:58.116494 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.116475 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 06:26:58.116618 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.116557 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 06:26:58.117475 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.117377 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nvgbf\"" Apr 21 06:26:58.117475 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.117388 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p6jr2\"" Apr 21 06:26:58.117475 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.117377 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 06:26:58.117475 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.117423 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 06:26:58.121842 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.121820 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b"] Apr 21 06:26:58.139718 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.139676 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7"] Apr 21 06:26:58.139814 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.139764 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.142901 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.142647 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 06:26:58.142901 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.142709 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 06:26:58.142901 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.142866 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 06:26:58.143541 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.143123 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 06:26:58.143541 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.143441 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-dmwvb\"" Apr 21 06:26:58.146199 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6d75f91-6fd1-4d46-9c53-7c8492b33064-tmp-dir\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.146313 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146212 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6d75f91-6fd1-4d46-9c53-7c8492b33064-config-volume\") pod \"dns-default-t2hf9\" (UID: 
\"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.146313 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.146313 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m48z\" (UniqueName: \"kubernetes.io/projected/f6d75f91-6fd1-4d46-9c53-7c8492b33064-kube-api-access-8m48z\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.146313 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:58.146538 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glbhf\" (UniqueName: \"kubernetes.io/projected/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-kube-api-access-glbhf\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:58.146608 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.146574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6d75f91-6fd1-4d46-9c53-7c8492b33064-tmp-dir\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " 
pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.146814 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.146797 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:58.146873 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.146858 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:58.64683982 +0000 UTC m=+33.131490825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found Apr 21 06:26:58.146918 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.146877 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:58.146967 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.146928 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:58.646908949 +0000 UTC m=+33.131559955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found Apr 21 06:26:58.147027 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.147014 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6d75f91-6fd1-4d46-9c53-7c8492b33064-config-volume\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.155970 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.155951 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b"] Apr 21 06:26:58.155970 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.155974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7"] Apr 21 06:26:58.156098 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.156061 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.158314 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.158156 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 06:26:58.161327 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.161305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m48z\" (UniqueName: \"kubernetes.io/projected/f6d75f91-6fd1-4d46-9c53-7c8492b33064-kube-api-access-8m48z\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.161434 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.161410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glbhf\" (UniqueName: \"kubernetes.io/projected/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-kube-api-access-glbhf\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:58.246770 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.246744 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgcp\" (UniqueName: \"kubernetes.io/projected/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-kube-api-access-8mgcp\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.246891 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.246797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-klusterlet-config\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: 
\"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.246891 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.246821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-tmp\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.246983 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.246908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f26fa83c-31ff-4104-84fa-d9bc0f857548-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-556dfbbf77-qvf4b\" (UID: \"f26fa83c-31ff-4104-84fa-d9bc0f857548\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.246983 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.246939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmcz\" (UniqueName: \"kubernetes.io/projected/f26fa83c-31ff-4104-84fa-d9bc0f857548-kube-api-access-kdmcz\") pod \"managed-serviceaccount-addon-agent-556dfbbf77-qvf4b\" (UID: \"f26fa83c-31ff-4104-84fa-d9bc0f857548\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.347605 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.347566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgcp\" (UniqueName: \"kubernetes.io/projected/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-kube-api-access-8mgcp\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.347722 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.347615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-klusterlet-config\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.347722 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.347639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-tmp\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.347722 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.347693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f26fa83c-31ff-4104-84fa-d9bc0f857548-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-556dfbbf77-qvf4b\" (UID: \"f26fa83c-31ff-4104-84fa-d9bc0f857548\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.347722 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.347718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmcz\" (UniqueName: \"kubernetes.io/projected/f26fa83c-31ff-4104-84fa-d9bc0f857548-kube-api-access-kdmcz\") pod \"managed-serviceaccount-addon-agent-556dfbbf77-qvf4b\" (UID: \"f26fa83c-31ff-4104-84fa-d9bc0f857548\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.348088 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:26:58.348068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-tmp\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.350994 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.350972 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f26fa83c-31ff-4104-84fa-d9bc0f857548-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-556dfbbf77-qvf4b\" (UID: \"f26fa83c-31ff-4104-84fa-d9bc0f857548\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.351552 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.351533 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-klusterlet-config\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.355381 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.355358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmcz\" (UniqueName: \"kubernetes.io/projected/f26fa83c-31ff-4104-84fa-d9bc0f857548-kube-api-access-kdmcz\") pod \"managed-serviceaccount-addon-agent-556dfbbf77-qvf4b\" (UID: \"f26fa83c-31ff-4104-84fa-d9bc0f857548\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.355556 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.355463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgcp\" (UniqueName: 
\"kubernetes.io/projected/476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5-kube-api-access-8mgcp\") pod \"klusterlet-addon-workmgr-677cd7bdc6-69hr7\" (UID: \"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.468798 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.468714 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" Apr 21 06:26:58.484572 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.484486 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" Apr 21 06:26:58.609861 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.609835 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b"] Apr 21 06:26:58.612821 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.612787 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7"] Apr 21 06:26:58.650165 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.650139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:58.650273 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.650178 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 
06:26:58.650329 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.650293 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:58.650329 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.650300 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:58.650409 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.650359 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:59.650339853 +0000 UTC m=+34.134990876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found Apr 21 06:26:58.650409 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.650376 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:26:59.650368935 +0000 UTC m=+34.135019936 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found Apr 21 06:26:58.750760 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.750703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:26:58.750871 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.750852 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 06:26:58.750936 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:58.750923 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:27:30.750903447 +0000 UTC m=+65.235554462 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : secret "metrics-daemon-secret" not found Apr 21 06:26:58.851172 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.851141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:58.855646 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:58.855623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlktq\" (UniqueName: \"kubernetes.io/projected/fdd3109b-7468-400c-b587-0e2d50c0911b-kube-api-access-hlktq\") pod \"network-check-target-zk874\" (UID: \"fdd3109b-7468-400c-b587-0e2d50c0911b\") " pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:59.040180 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:59.040151 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zk874" Apr 21 06:26:59.656061 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:59.656021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:26:59.656229 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:26:59.656071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:26:59.656229 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:59.656171 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 06:26:59.656229 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:59.656187 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 06:26:59.656345 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:59.656247 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:01.656224718 +0000 UTC m=+36.140875723 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found Apr 21 06:26:59.656345 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:26:59.656269 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:01.656255623 +0000 UTC m=+36.140906637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found Apr 21 06:27:00.670410 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:00.670373 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod476aa6a6_c72d_43ba_a3d6_3af0ebe8b4f5.slice/crio-cf0971bc0b57eb90d759c2599665b9ab0f1bc635e0924ca4887840267ddae327 WatchSource:0}: Error finding container cf0971bc0b57eb90d759c2599665b9ab0f1bc635e0924ca4887840267ddae327: Status 404 returned error can't find the container with id cf0971bc0b57eb90d759c2599665b9ab0f1bc635e0924ca4887840267ddae327 Apr 21 06:27:00.670803 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:00.670784 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26fa83c_31ff_4104_84fa_d9bc0f857548.slice/crio-f1eb1f2e2ca4ad57b9ee439d2ecf5285990d179559e0eaa3cf1c7b32ad407832 WatchSource:0}: Error finding container f1eb1f2e2ca4ad57b9ee439d2ecf5285990d179559e0eaa3cf1c7b32ad407832: Status 404 returned error can't find the container with id f1eb1f2e2ca4ad57b9ee439d2ecf5285990d179559e0eaa3cf1c7b32ad407832 
Apr 21 06:27:00.778946 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:00.778920 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zk874"]
Apr 21 06:27:00.819837 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:00.819801 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd3109b_7468_400c_b587_0e2d50c0911b.slice/crio-cf2854ef09fe1ad8d4f8b3ef9c3fe1db19c5f81f8a556effd4a410644a083630 WatchSource:0}: Error finding container cf2854ef09fe1ad8d4f8b3ef9c3fe1db19c5f81f8a556effd4a410644a083630: Status 404 returned error can't find the container with id cf2854ef09fe1ad8d4f8b3ef9c3fe1db19c5f81f8a556effd4a410644a083630
Apr 21 06:27:01.307383 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.307144 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b985311-2ecb-45b4-8665-a9a42cef2837" containerID="d238f5c74a122ea71625381ee788ded16feb6e339afa1a7cfab42559baaac8a6" exitCode=0
Apr 21 06:27:01.308295 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.308242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerDied","Data":"d238f5c74a122ea71625381ee788ded16feb6e339afa1a7cfab42559baaac8a6"}
Apr 21 06:27:01.310469 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.310410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zk874" event={"ID":"fdd3109b-7468-400c-b587-0e2d50c0911b","Type":"ContainerStarted","Data":"cf2854ef09fe1ad8d4f8b3ef9c3fe1db19c5f81f8a556effd4a410644a083630"}
Apr 21 06:27:01.312965 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.312627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" event={"ID":"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5","Type":"ContainerStarted","Data":"cf0971bc0b57eb90d759c2599665b9ab0f1bc635e0924ca4887840267ddae327"}
Apr 21 06:27:01.314961 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.314916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" event={"ID":"f26fa83c-31ff-4104-84fa-d9bc0f857548","Type":"ContainerStarted","Data":"f1eb1f2e2ca4ad57b9ee439d2ecf5285990d179559e0eaa3cf1c7b32ad407832"}
Apr 21 06:27:01.671162 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.671123 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh"
Apr 21 06:27:01.671798 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:01.671251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9"
Apr 21 06:27:01.671798 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:01.671259 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:01.671798 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:01.671333 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:05.671311837 +0000 UTC m=+40.155962840 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found
Apr 21 06:27:01.671798 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:01.671359 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:01.671798 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:01.671408 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:05.671392905 +0000 UTC m=+40.156043905 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:02.321909 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:02.321818 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b985311-2ecb-45b4-8665-a9a42cef2837" containerID="7fab4023cb8086c04e7833d6674510beb74585c75da60bd7adf8003a90423b64" exitCode=0
Apr 21 06:27:02.321909 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:02.321876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerDied","Data":"7fab4023cb8086c04e7833d6674510beb74585c75da60bd7adf8003a90423b64"}
Apr 21 06:27:05.705359 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:05.705319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9"
Apr 21 06:27:05.705359 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:05.705360 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh"
Apr 21 06:27:05.705856 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:05.705491 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:05.705856 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:05.705492 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:05.705856 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:05.705572 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:13.705554669 +0000 UTC m=+48.190205672 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found
Apr 21 06:27:05.705856 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:05.705586 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:13.705580273 +0000 UTC m=+48.190231278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:07.331497 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.331455 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zk874" event={"ID":"fdd3109b-7468-400c-b587-0e2d50c0911b","Type":"ContainerStarted","Data":"82d450232fb19cd0f73c005058a365cad9e7804234122f94df45919231840e34"}
Apr 21 06:27:07.332028 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.331564 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zk874"
Apr 21 06:27:07.332738 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.332718 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" event={"ID":"476aa6a6-c72d-43ba-a3d6-3af0ebe8b4f5","Type":"ContainerStarted","Data":"cf20f52bab190e1b476aedcc62dfb254828c97a31fe102f50fd7fab346f82354"}
Apr 21 06:27:07.332913 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.332898 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7"
Apr 21 06:27:07.333966 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.333936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" event={"ID":"f26fa83c-31ff-4104-84fa-d9bc0f857548","Type":"ContainerStarted","Data":"0dece40a1e867174bf4da5a08702baf5894bed780f50ad6f901e6794eedf23bf"}
Apr 21 06:27:07.334881 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.334843 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7"
Apr 21 06:27:07.336697 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.336677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-stw9q" event={"ID":"2b985311-2ecb-45b4-8665-a9a42cef2837","Type":"ContainerStarted","Data":"8fa791fd41f66f2e48c797dd121b9f80ca5e3791b8089150834a4048ca4b89a5"}
Apr 21 06:27:07.345815 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.345769 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zk874" podStartSLOduration=35.560783344 podStartE2EDuration="41.345754074s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:27:00.82786793 +0000 UTC m=+35.312518930" lastFinishedPulling="2026-04-21 06:27:06.612838661 +0000 UTC m=+41.097489660" observedRunningTime="2026-04-21 06:27:07.345048609 +0000 UTC m=+41.829699653" watchObservedRunningTime="2026-04-21 06:27:07.345754074 +0000 UTC m=+41.830405097"
Apr 21 06:27:07.380378 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.380331 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-stw9q" podStartSLOduration=7.86807139 podStartE2EDuration="41.380318532s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:26:27.338843626 +0000 UTC m=+1.823494626" lastFinishedPulling="2026-04-21 06:27:00.851090768 +0000 UTC m=+35.335741768" observedRunningTime="2026-04-21 06:27:07.378758544 +0000 UTC m=+41.863409566" watchObservedRunningTime="2026-04-21 06:27:07.380318532 +0000 UTC m=+41.864969555"
Apr 21 06:27:07.380631 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.380606 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-556dfbbf77-qvf4b" podStartSLOduration=3.440809463 podStartE2EDuration="9.380599179s" podCreationTimestamp="2026-04-21 06:26:58 +0000 UTC" firstStartedPulling="2026-04-21 06:27:00.673104125 +0000 UTC m=+35.157755138" lastFinishedPulling="2026-04-21 06:27:06.612893852 +0000 UTC m=+41.097544854" observedRunningTime="2026-04-21 06:27:07.360920723 +0000 UTC m=+41.845571746" watchObservedRunningTime="2026-04-21 06:27:07.380599179 +0000 UTC m=+41.865250201"
Apr 21 06:27:07.392783 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:07.392753 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-677cd7bdc6-69hr7" podStartSLOduration=3.433681294 podStartE2EDuration="9.392744365s" podCreationTimestamp="2026-04-21 06:26:58 +0000 UTC" firstStartedPulling="2026-04-21 06:27:00.672917908 +0000 UTC m=+35.157568929" lastFinishedPulling="2026-04-21 06:27:06.631980983 +0000 UTC m=+41.116632000" observedRunningTime="2026-04-21 06:27:07.392305797 +0000 UTC m=+41.876956820" watchObservedRunningTime="2026-04-21 06:27:07.392744365 +0000 UTC m=+41.877395381"
Apr 21 06:27:08.121950 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:08.121922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5"
Apr 21 06:27:08.125319 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:08.125295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/75cb84e7-4602-4954-9579-ec59fa9a8289-original-pull-secret\") pod \"global-pull-secret-syncer-rlzb5\" (UID: \"75cb84e7-4602-4954-9579-ec59fa9a8289\") " pod="kube-system/global-pull-secret-syncer-rlzb5"
Apr 21 06:27:08.325056 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:08.325027 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzb5"
Apr 21 06:27:08.431464 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:08.431437 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rlzb5"]
Apr 21 06:27:08.434094 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:08.434071 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75cb84e7_4602_4954_9579_ec59fa9a8289.slice/crio-6ca62230dd342771e4d87b58adccd907d99e907987b7ae70b876e538acc39c4b WatchSource:0}: Error finding container 6ca62230dd342771e4d87b58adccd907d99e907987b7ae70b876e538acc39c4b: Status 404 returned error can't find the container with id 6ca62230dd342771e4d87b58adccd907d99e907987b7ae70b876e538acc39c4b
Apr 21 06:27:09.342287 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:09.342246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rlzb5" event={"ID":"75cb84e7-4602-4954-9579-ec59fa9a8289","Type":"ContainerStarted","Data":"6ca62230dd342771e4d87b58adccd907d99e907987b7ae70b876e538acc39c4b"}
Apr 21 06:27:12.349631 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:12.349554 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rlzb5" event={"ID":"75cb84e7-4602-4954-9579-ec59fa9a8289","Type":"ContainerStarted","Data":"25bf86b78b0691005824312e673376aa93578b5d8621991c98ca0a2378cb4366"}
Apr 21 06:27:12.363859 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:12.363814 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rlzb5" podStartSLOduration=32.741346938 podStartE2EDuration="36.363802108s" podCreationTimestamp="2026-04-21 06:26:36 +0000 UTC" firstStartedPulling="2026-04-21 06:27:08.435701038 +0000 UTC m=+42.920352038" 
lastFinishedPulling="2026-04-21 06:27:12.058156192 +0000 UTC m=+46.542807208" observedRunningTime="2026-04-21 06:27:12.363263036 +0000 UTC m=+46.847914059" watchObservedRunningTime="2026-04-21 06:27:12.363802108 +0000 UTC m=+46.848453129"
Apr 21 06:27:13.761652 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:13.761617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9"
Apr 21 06:27:13.761652 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:13.761653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh"
Apr 21 06:27:13.762052 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:13.761751 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:13.762052 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:13.761755 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:13.762052 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:13.761802 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:29.761788454 +0000 UTC m=+64.246439453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found
Apr 21 06:27:13.762052 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:13.761814 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:27:29.761808633 +0000 UTC m=+64.246459633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:23.295216 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:23.295186 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8rxc"
Apr 21 06:27:29.767487 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:29.767439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9"
Apr 21 06:27:29.767487 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:29.767489 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh"
Apr 21 06:27:29.768037 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:29.767612 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 06:27:29.768037 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:29.767625 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 06:27:29.768037 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:29.767681 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert podName:4aa4526d-b9ab-4ef8-9333-2a849fe6acc7 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:01.767666071 +0000 UTC m=+96.252317071 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert") pod "ingress-canary-mdnlh" (UID: "4aa4526d-b9ab-4ef8-9333-2a849fe6acc7") : secret "canary-serving-cert" not found
Apr 21 06:27:29.768037 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:29.767695 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls podName:f6d75f91-6fd1-4d46-9c53-7c8492b33064 nodeName:}" failed. No retries permitted until 2026-04-21 06:28:01.767688102 +0000 UTC m=+96.252339102 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls") pod "dns-default-t2hf9" (UID: "f6d75f91-6fd1-4d46-9c53-7c8492b33064") : secret "dns-default-metrics-tls" not found
Apr 21 06:27:30.775472 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:30.775439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj"
Apr 21 06:27:30.775862 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:30.775609 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 06:27:30.775862 ip-10-0-138-76 kubenswrapper[2571]: E0421 06:27:30.775677 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs podName:546de538-b56a-4ad2-baeb-3d59144586fb nodeName:}" failed. No retries permitted until 2026-04-21 06:28:34.775660637 +0000 UTC m=+129.260311637 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs") pod "network-metrics-daemon-qjchj" (UID: "546de538-b56a-4ad2-baeb-3d59144586fb") : secret "metrics-daemon-secret" not found
Apr 21 06:27:38.341392 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:38.341264 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zk874"
Apr 21 06:27:43.881432 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.881399 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"]
Apr 21 06:27:43.888710 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.888688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:43.890929 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.890909 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 06:27:43.891134 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.891113 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 06:27:43.891217 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.891143 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:27:43.892267 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.892243 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 06:27:43.892378 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.892347 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-qwr46\""
Apr 21 06:27:43.893357 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.893335 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"]
Apr 21 06:27:43.966339 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.966312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19694577-e921-48dd-acfc-d48492e5ee03-config\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:43.966444 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.966381 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19694577-e921-48dd-acfc-d48492e5ee03-serving-cert\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:43.966444 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:43.966432 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8tck\" (UniqueName: \"kubernetes.io/projected/19694577-e921-48dd-acfc-d48492e5ee03-kube-api-access-d8tck\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.066723 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.066694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19694577-e921-48dd-acfc-d48492e5ee03-config\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.066808 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.066750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19694577-e921-48dd-acfc-d48492e5ee03-serving-cert\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.066808 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.066779 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8tck\" (UniqueName: \"kubernetes.io/projected/19694577-e921-48dd-acfc-d48492e5ee03-kube-api-access-d8tck\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.067377 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.067355 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19694577-e921-48dd-acfc-d48492e5ee03-config\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.069295 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.069273 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19694577-e921-48dd-acfc-d48492e5ee03-serving-cert\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.075054 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.075030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8tck\" (UniqueName: \"kubernetes.io/projected/19694577-e921-48dd-acfc-d48492e5ee03-kube-api-access-d8tck\") pod \"service-ca-operator-d6fc45fc5-s74rh\" (UID: \"19694577-e921-48dd-acfc-d48492e5ee03\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.197886 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.197832 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"
Apr 21 06:27:44.309971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.309942 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh"]
Apr 21 06:27:44.314334 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:44.314306 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19694577_e921_48dd_acfc_d48492e5ee03.slice/crio-453ed0f512881e67f36ffe1111176a9ef0169986b2fbdd712b16ca9c477c8087 WatchSource:0}: Error finding container 453ed0f512881e67f36ffe1111176a9ef0169986b2fbdd712b16ca9c477c8087: Status 404 returned error can't find the container with id 453ed0f512881e67f36ffe1111176a9ef0169986b2fbdd712b16ca9c477c8087
Apr 21 06:27:44.412684 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.412657 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh" event={"ID":"19694577-e921-48dd-acfc-d48492e5ee03","Type":"ContainerStarted","Data":"453ed0f512881e67f36ffe1111176a9ef0169986b2fbdd712b16ca9c477c8087"}
Apr 21 06:27:44.436273 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.436246 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"]
Apr 21 06:27:44.440665 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.440649 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"
Apr 21 06:27:44.443462 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.443443 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-t7jrq\""
Apr 21 06:27:44.446594 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.446575 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"]
Apr 21 06:27:44.571903 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.571881 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbb8j\" (UniqueName: \"kubernetes.io/projected/afbf0267-655b-4cf1-bb8f-dcfa09f69f56-kube-api-access-kbb8j\") pod \"network-check-source-8894fc9bd-gcmqq\" (UID: \"afbf0267-655b-4cf1-bb8f-dcfa09f69f56\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"
Apr 21 06:27:44.672552 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.672529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbb8j\" (UniqueName: \"kubernetes.io/projected/afbf0267-655b-4cf1-bb8f-dcfa09f69f56-kube-api-access-kbb8j\") pod \"network-check-source-8894fc9bd-gcmqq\" (UID: \"afbf0267-655b-4cf1-bb8f-dcfa09f69f56\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"
Apr 21 06:27:44.681830 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.681734 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbb8j\" (UniqueName: \"kubernetes.io/projected/afbf0267-655b-4cf1-bb8f-dcfa09f69f56-kube-api-access-kbb8j\") pod \"network-check-source-8894fc9bd-gcmqq\" (UID: \"afbf0267-655b-4cf1-bb8f-dcfa09f69f56\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"
Apr 21 06:27:44.750065 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.750034 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"
Apr 21 06:27:44.863265 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:44.863199 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq"]
Apr 21 06:27:44.866426 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:44.866395 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafbf0267_655b_4cf1_bb8f_dcfa09f69f56.slice/crio-67d0968b51098c58458f1c1f21fa247a987c81246c9198c7457d12f8239c23ec WatchSource:0}: Error finding container 67d0968b51098c58458f1c1f21fa247a987c81246c9198c7457d12f8239c23ec: Status 404 returned error can't find the container with id 67d0968b51098c58458f1c1f21fa247a987c81246c9198c7457d12f8239c23ec
Apr 21 06:27:45.417401 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:45.417365 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq" event={"ID":"afbf0267-655b-4cf1-bb8f-dcfa09f69f56","Type":"ContainerStarted","Data":"75bbb4d4cfb3769c1d0e7cf224f6ec346389e046093f58f6af115500142303d3"}
Apr 21 06:27:45.417859 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:45.417410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq" event={"ID":"afbf0267-655b-4cf1-bb8f-dcfa09f69f56","Type":"ContainerStarted","Data":"67d0968b51098c58458f1c1f21fa247a987c81246c9198c7457d12f8239c23ec"}
Apr 21 06:27:45.431245 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:45.431206 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gcmqq" podStartSLOduration=1.431192241 podStartE2EDuration="1.431192241s" podCreationTimestamp="2026-04-21 06:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:27:45.429671494 +0000 UTC m=+79.914322515" watchObservedRunningTime="2026-04-21 06:27:45.431192241 +0000 UTC m=+79.915843271"
Apr 21 06:27:46.421172 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:46.421096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh" event={"ID":"19694577-e921-48dd-acfc-d48492e5ee03","Type":"ContainerStarted","Data":"e9d0f62a6fdba6e69d79a134c5bd6de541e0de3f5689ea0901c1dd808599d600"}
Apr 21 06:27:46.435455 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:46.435409 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh" podStartSLOduration=1.6696286 podStartE2EDuration="3.435396027s" podCreationTimestamp="2026-04-21 06:27:43 +0000 UTC" firstStartedPulling="2026-04-21 06:27:44.315939959 +0000 UTC m=+78.800590959" lastFinishedPulling="2026-04-21 06:27:46.081707386 +0000 UTC m=+80.566358386" observedRunningTime="2026-04-21 06:27:46.434628153 +0000 UTC m=+80.919279176" watchObservedRunningTime="2026-04-21 06:27:46.435396027 +0000 UTC m=+80.920047059"
Apr 21 06:27:47.507321 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.507291 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd"]
Apr 21 06:27:47.510407 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.510390 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd"
Apr 21 06:27:47.512644 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.512626 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 06:27:47.512762 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.512743 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vwt77\""
Apr 21 06:27:47.513677 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.513663 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 06:27:47.518241 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.518219 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd"]
Apr 21 06:27:47.593040 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.593014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnljh\" (UniqueName: \"kubernetes.io/projected/250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2-kube-api-access-dnljh\") pod \"migrator-74bb7799d9-dfjbd\" (UID: \"250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd"
Apr 21 06:27:47.693323 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.693297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnljh\" (UniqueName: \"kubernetes.io/projected/250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2-kube-api-access-dnljh\") pod \"migrator-74bb7799d9-dfjbd\" (UID: \"250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd"
Apr 21 06:27:47.702560 ip-10-0-138-76 kubenswrapper[2571]: 
I0421 06:27:47.702543 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnljh\" (UniqueName: \"kubernetes.io/projected/250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2-kube-api-access-dnljh\") pod \"migrator-74bb7799d9-dfjbd\" (UID: \"250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd" Apr 21 06:27:47.819207 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.819190 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd" Apr 21 06:27:47.950693 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:47.950666 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd"] Apr 21 06:27:47.953199 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:27:47.953169 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod250dc6fb_a5a2_4b36_8c2d_2c0ba8d08ee2.slice/crio-a439516c6453a1dfb6b6f3aa67479e295a5c1fcee4d024a7397894b35c962425 WatchSource:0}: Error finding container a439516c6453a1dfb6b6f3aa67479e295a5c1fcee4d024a7397894b35c962425: Status 404 returned error can't find the container with id a439516c6453a1dfb6b6f3aa67479e295a5c1fcee4d024a7397894b35c962425 Apr 21 06:27:48.431856 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:48.431822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd" event={"ID":"250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2","Type":"ContainerStarted","Data":"a439516c6453a1dfb6b6f3aa67479e295a5c1fcee4d024a7397894b35c962425"} Apr 21 06:27:49.096119 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:49.096044 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9x6w4_a90e0df9-bd86-4e43-ab44-ddd45d0f1a43/dns-node-resolver/0.log" Apr 21 06:27:49.436779 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:49.436698 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd" event={"ID":"250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2","Type":"ContainerStarted","Data":"dbd75e0245d17e87da958c7e6dbeb9b048dae31244008619a1b3355712df613f"} Apr 21 06:27:49.436779 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:49.436735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd" event={"ID":"250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2","Type":"ContainerStarted","Data":"a4bf24d12e3397ebc80680ff76d51deb052167e8e7ae02b20fcc234d7b8635bb"} Apr 21 06:27:49.450656 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:49.450613 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dfjbd" podStartSLOduration=1.602150397 podStartE2EDuration="2.450601551s" podCreationTimestamp="2026-04-21 06:27:47 +0000 UTC" firstStartedPulling="2026-04-21 06:27:47.95507853 +0000 UTC m=+82.439729529" lastFinishedPulling="2026-04-21 06:27:48.803529666 +0000 UTC m=+83.288180683" observedRunningTime="2026-04-21 06:27:49.450021414 +0000 UTC m=+83.934672435" watchObservedRunningTime="2026-04-21 06:27:49.450601551 +0000 UTC m=+83.935252573" Apr 21 06:27:49.896197 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:27:49.896173 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qvz4w_b319b9ca-8134-427f-bce9-921c4216c413/node-ca/0.log" Apr 21 06:28:01.788186 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.788156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:28:01.788186 
ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.788191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:28:01.790565 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.790540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6d75f91-6fd1-4d46-9c53-7c8492b33064-metrics-tls\") pod \"dns-default-t2hf9\" (UID: \"f6d75f91-6fd1-4d46-9c53-7c8492b33064\") " pod="openshift-dns/dns-default-t2hf9" Apr 21 06:28:01.790678 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.790657 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aa4526d-b9ab-4ef8-9333-2a849fe6acc7-cert\") pod \"ingress-canary-mdnlh\" (UID: \"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7\") " pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:28:01.809816 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.809796 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-98jt7\"" Apr 21 06:28:01.817729 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.817715 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mdnlh" Apr 21 06:28:01.927422 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:01.927389 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mdnlh"] Apr 21 06:28:01.930779 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:01.930753 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa4526d_b9ab_4ef8_9333_2a849fe6acc7.slice/crio-04870f72a7856c4e0dfde781338caad72c65336bba05d544133198fe8addfbd1 WatchSource:0}: Error finding container 04870f72a7856c4e0dfde781338caad72c65336bba05d544133198fe8addfbd1: Status 404 returned error can't find the container with id 04870f72a7856c4e0dfde781338caad72c65336bba05d544133198fe8addfbd1 Apr 21 06:28:02.092563 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:02.092539 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gz466\"" Apr 21 06:28:02.100558 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:02.100536 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t2hf9" Apr 21 06:28:02.210089 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:02.210057 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t2hf9"] Apr 21 06:28:02.213383 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:02.213360 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d75f91_6fd1_4d46_9c53_7c8492b33064.slice/crio-05f61e82eee361a1f3d66f4423a7db3399d6b15fda86f785409430929dfec7d1 WatchSource:0}: Error finding container 05f61e82eee361a1f3d66f4423a7db3399d6b15fda86f785409430929dfec7d1: Status 404 returned error can't find the container with id 05f61e82eee361a1f3d66f4423a7db3399d6b15fda86f785409430929dfec7d1 Apr 21 06:28:02.472445 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:02.472362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mdnlh" event={"ID":"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7","Type":"ContainerStarted","Data":"04870f72a7856c4e0dfde781338caad72c65336bba05d544133198fe8addfbd1"} Apr 21 06:28:02.473612 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:02.473577 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t2hf9" event={"ID":"f6d75f91-6fd1-4d46-9c53-7c8492b33064","Type":"ContainerStarted","Data":"05f61e82eee361a1f3d66f4423a7db3399d6b15fda86f785409430929dfec7d1"} Apr 21 06:28:04.479393 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:04.479355 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mdnlh" event={"ID":"4aa4526d-b9ab-4ef8-9333-2a849fe6acc7","Type":"ContainerStarted","Data":"cba123d6e7de8df3b31439c0827e1137fe1453317975b96185f1c176455ae70f"} Apr 21 06:28:04.480977 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:04.480951 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t2hf9" 
event={"ID":"f6d75f91-6fd1-4d46-9c53-7c8492b33064","Type":"ContainerStarted","Data":"06733746d14f2e5996ad8551e9025cd10704785c2067e95754d2dc51ef6a1f30"} Apr 21 06:28:04.481091 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:04.480983 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t2hf9" event={"ID":"f6d75f91-6fd1-4d46-9c53-7c8492b33064","Type":"ContainerStarted","Data":"5c542049beddd0c90a1ffcba405f0c9381c8da50d325dcaf2eac9041de530176"} Apr 21 06:28:04.481091 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:04.481084 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t2hf9" Apr 21 06:28:04.494003 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:04.493961 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mdnlh" podStartSLOduration=65.569558796 podStartE2EDuration="1m7.493948273s" podCreationTimestamp="2026-04-21 06:26:57 +0000 UTC" firstStartedPulling="2026-04-21 06:28:01.932681537 +0000 UTC m=+96.417332536" lastFinishedPulling="2026-04-21 06:28:03.857071006 +0000 UTC m=+98.341722013" observedRunningTime="2026-04-21 06:28:04.492698806 +0000 UTC m=+98.977349857" watchObservedRunningTime="2026-04-21 06:28:04.493948273 +0000 UTC m=+98.978599359" Apr 21 06:28:04.507265 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:04.507228 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t2hf9" podStartSLOduration=65.863893497 podStartE2EDuration="1m7.507217511s" podCreationTimestamp="2026-04-21 06:26:57 +0000 UTC" firstStartedPulling="2026-04-21 06:28:02.215167904 +0000 UTC m=+96.699818908" lastFinishedPulling="2026-04-21 06:28:03.858491922 +0000 UTC m=+98.343142922" observedRunningTime="2026-04-21 06:28:04.506807721 +0000 UTC m=+98.991458744" watchObservedRunningTime="2026-04-21 06:28:04.507217511 +0000 UTC m=+98.991868533" Apr 21 06:28:09.238271 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:28:09.238235 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5n64c"] Apr 21 06:28:09.241408 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.241392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.243992 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.243951 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 06:28:09.245106 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.245084 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 06:28:09.245285 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.245271 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 06:28:09.245350 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.245303 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7w99g\"" Apr 21 06:28:09.245635 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.245618 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 06:28:09.251084 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.251065 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5n64c"] Apr 21 06:28:09.294815 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.294785 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-579f775496-zztsw"] Apr 21 06:28:09.297667 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.297648 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.300096 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.300078 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 06:28:09.300456 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.300437 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 06:28:09.300575 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.300471 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9wmrl\"" Apr 21 06:28:09.300575 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.300446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 06:28:09.306619 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.306599 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 06:28:09.308534 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.308496 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579f775496-zztsw"] Apr 21 06:28:09.338134 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.338113 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-crio-socket\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.338233 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.338142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.338275 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.338237 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-data-volume\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.338275 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.338262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.338338 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.338281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rvlw\" (UniqueName: \"kubernetes.io/projected/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-kube-api-access-9rvlw\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.393207 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.393187 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-2f6rv"] Apr 21 06:28:09.396130 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.396116 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:09.398713 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.398697 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 06:28:09.398924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.398910 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wdntc\"" Apr 21 06:28:09.398973 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.398922 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 06:28:09.406400 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.406379 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-2f6rv"] Apr 21 06:28:09.438948 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.438930 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b58fb78-ebee-424f-9022-f06fa1fd9290-trusted-ca\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439035 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.438960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-data-volume\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439035 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.438977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rvlw\" (UniqueName: 
\"kubernetes.io/projected/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-kube-api-access-9rvlw\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439035 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.438999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b58fb78-ebee-424f-9022-f06fa1fd9290-registry-certificates\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439035 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b58fb78-ebee-424f-9022-f06fa1fd9290-ca-trust-extracted\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439207 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439207 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b58fb78-ebee-424f-9022-f06fa1fd9290-installation-pull-secrets\") pod \"image-registry-579f775496-zztsw\" (UID: 
\"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439207 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439172 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-registry-tls\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-crio-socket\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439271 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b58fb78-ebee-424f-9022-f06fa1fd9290-image-registry-private-configuration\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-data-volume\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439294 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439336 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439336 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-crio-socket\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.439572 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-bound-sa-token\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439572 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439429 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h7qr\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-kube-api-access-8h7qr\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.439739 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.439723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5n64c\" (UID: 
\"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.441332 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.441314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.447361 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.447336 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rvlw\" (UniqueName: \"kubernetes.io/projected/b31dbc8f-189e-47fe-8be6-d02332dd3cb9-kube-api-access-9rvlw\") pod \"insights-runtime-extractor-5n64c\" (UID: \"b31dbc8f-189e-47fe-8be6-d02332dd3cb9\") " pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.539767 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b58fb78-ebee-424f-9022-f06fa1fd9290-installation-pull-secrets\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.539865 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-registry-tls\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.539865 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539796 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nsvg\" (UniqueName: \"kubernetes.io/projected/75155df7-4d6d-4333-9af5-53bf8969c877-kube-api-access-5nsvg\") pod \"downloads-6bcc868b7-2f6rv\" (UID: \"75155df7-4d6d-4333-9af5-53bf8969c877\") " pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:09.539865 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b58fb78-ebee-424f-9022-f06fa1fd9290-image-registry-private-configuration\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540026 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-bound-sa-token\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540026 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h7qr\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-kube-api-access-8h7qr\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540026 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.539979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b58fb78-ebee-424f-9022-f06fa1fd9290-trusted-ca\") pod \"image-registry-579f775496-zztsw\" (UID: 
\"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540026 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.540014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b58fb78-ebee-424f-9022-f06fa1fd9290-registry-certificates\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540220 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.540051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b58fb78-ebee-424f-9022-f06fa1fd9290-ca-trust-extracted\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540418 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.540402 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7b58fb78-ebee-424f-9022-f06fa1fd9290-ca-trust-extracted\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.540872 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.540849 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7b58fb78-ebee-424f-9022-f06fa1fd9290-registry-certificates\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.541068 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.541047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b58fb78-ebee-424f-9022-f06fa1fd9290-trusted-ca\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.542313 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.542294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7b58fb78-ebee-424f-9022-f06fa1fd9290-image-registry-private-configuration\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.542595 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.542575 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-registry-tls\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.543022 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.543002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7b58fb78-ebee-424f-9022-f06fa1fd9290-installation-pull-secrets\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.548485 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.548467 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h7qr\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-kube-api-access-8h7qr\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " 
pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.548614 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.548595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b58fb78-ebee-424f-9022-f06fa1fd9290-bound-sa-token\") pod \"image-registry-579f775496-zztsw\" (UID: \"7b58fb78-ebee-424f-9022-f06fa1fd9290\") " pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.551359 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.551339 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5n64c" Apr 21 06:28:09.606947 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.606923 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:09.640422 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.640390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nsvg\" (UniqueName: \"kubernetes.io/projected/75155df7-4d6d-4333-9af5-53bf8969c877-kube-api-access-5nsvg\") pod \"downloads-6bcc868b7-2f6rv\" (UID: \"75155df7-4d6d-4333-9af5-53bf8969c877\") " pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:09.648808 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.648760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nsvg\" (UniqueName: \"kubernetes.io/projected/75155df7-4d6d-4333-9af5-53bf8969c877-kube-api-access-5nsvg\") pod \"downloads-6bcc868b7-2f6rv\" (UID: \"75155df7-4d6d-4333-9af5-53bf8969c877\") " pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:09.670788 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.670760 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5n64c"] Apr 21 06:28:09.673850 ip-10-0-138-76 
kubenswrapper[2571]: W0421 06:28:09.673802 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31dbc8f_189e_47fe_8be6_d02332dd3cb9.slice/crio-f5dda2edd0ebd878bbff8ea486ddc21b176f89da2207b6d8a5e885b0fdeca3a9 WatchSource:0}: Error finding container f5dda2edd0ebd878bbff8ea486ddc21b176f89da2207b6d8a5e885b0fdeca3a9: Status 404 returned error can't find the container with id f5dda2edd0ebd878bbff8ea486ddc21b176f89da2207b6d8a5e885b0fdeca3a9 Apr 21 06:28:09.703641 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.703614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:09.730374 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.730350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579f775496-zztsw"] Apr 21 06:28:09.733087 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:09.733060 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b58fb78_ebee_424f_9022_f06fa1fd9290.slice/crio-ab7053ce2cd1212c9e5014b8a12f51c79616903b7a1538d0fcbf9186ce6fb0e0 WatchSource:0}: Error finding container ab7053ce2cd1212c9e5014b8a12f51c79616903b7a1538d0fcbf9186ce6fb0e0: Status 404 returned error can't find the container with id ab7053ce2cd1212c9e5014b8a12f51c79616903b7a1538d0fcbf9186ce6fb0e0 Apr 21 06:28:09.825760 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:09.825737 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-2f6rv"] Apr 21 06:28:09.828682 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:09.828653 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75155df7_4d6d_4333_9af5_53bf8969c877.slice/crio-f8bc95a936d0cbf3443bfe1e79ab7795797b05e4992d9f874a565e02a2a412b4 
WatchSource:0}: Error finding container f8bc95a936d0cbf3443bfe1e79ab7795797b05e4992d9f874a565e02a2a412b4: Status 404 returned error can't find the container with id f8bc95a936d0cbf3443bfe1e79ab7795797b05e4992d9f874a565e02a2a412b4 Apr 21 06:28:10.498287 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.498240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579f775496-zztsw" event={"ID":"7b58fb78-ebee-424f-9022-f06fa1fd9290","Type":"ContainerStarted","Data":"3da1cf732cc21da3aeb5c271b7f8eeeed828c7597ef6eaba6e04f29dc455fb8a"} Apr 21 06:28:10.498287 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.498288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579f775496-zztsw" event={"ID":"7b58fb78-ebee-424f-9022-f06fa1fd9290","Type":"ContainerStarted","Data":"ab7053ce2cd1212c9e5014b8a12f51c79616903b7a1538d0fcbf9186ce6fb0e0"} Apr 21 06:28:10.498799 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.498559 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:10.500298 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.500271 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5n64c" event={"ID":"b31dbc8f-189e-47fe-8be6-d02332dd3cb9","Type":"ContainerStarted","Data":"575d50512f4777bfdaf14ff969c7de04f4bff092b7a2a31b1535e9800cfd115f"} Apr 21 06:28:10.500411 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.500302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5n64c" event={"ID":"b31dbc8f-189e-47fe-8be6-d02332dd3cb9","Type":"ContainerStarted","Data":"31db77a37971824ec8e7252d0e9955ff0538605a5f511204fc9aaa7bc5be5c25"} Apr 21 06:28:10.500411 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.500315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-5n64c" event={"ID":"b31dbc8f-189e-47fe-8be6-d02332dd3cb9","Type":"ContainerStarted","Data":"f5dda2edd0ebd878bbff8ea486ddc21b176f89da2207b6d8a5e885b0fdeca3a9"} Apr 21 06:28:10.501483 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.501425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-2f6rv" event={"ID":"75155df7-4d6d-4333-9af5-53bf8969c877","Type":"ContainerStarted","Data":"f8bc95a936d0cbf3443bfe1e79ab7795797b05e4992d9f874a565e02a2a412b4"} Apr 21 06:28:10.518035 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:10.517995 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-579f775496-zztsw" podStartSLOduration=1.517984293 podStartE2EDuration="1.517984293s" podCreationTimestamp="2026-04-21 06:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:28:10.515983849 +0000 UTC m=+105.000634906" watchObservedRunningTime="2026-04-21 06:28:10.517984293 +0000 UTC m=+105.002635308" Apr 21 06:28:12.509419 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:12.509380 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5n64c" event={"ID":"b31dbc8f-189e-47fe-8be6-d02332dd3cb9","Type":"ContainerStarted","Data":"9f4463b1958766aa76eef44b5836d4ce90d339428fdca6de1d5b249223b58d7f"} Apr 21 06:28:12.529609 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:12.529557 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5n64c" podStartSLOduration=1.593718568 podStartE2EDuration="3.529542777s" podCreationTimestamp="2026-04-21 06:28:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:09.7445367 +0000 UTC m=+104.229187703" lastFinishedPulling="2026-04-21 06:28:11.680360899 +0000 UTC m=+106.165011912" 
observedRunningTime="2026-04-21 06:28:12.528475916 +0000 UTC m=+107.013126960" watchObservedRunningTime="2026-04-21 06:28:12.529542777 +0000 UTC m=+107.014193799" Apr 21 06:28:14.486489 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:14.486460 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t2hf9" Apr 21 06:28:24.642479 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.642442 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gnrvf"] Apr 21 06:28:24.647154 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.647131 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4"] Apr 21 06:28:24.647335 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.647308 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.649968 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.649945 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 06:28:24.649968 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.649957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 06:28:24.650133 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.650020 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 06:28:24.650133 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.650049 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 06:28:24.650721 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.650698 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.651175 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.651148 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-z8lmd\"" Apr 21 06:28:24.651280 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.651244 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 06:28:24.651414 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.651387 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 06:28:24.653156 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.653137 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 06:28:24.653420 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.653405 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-n9bxh\"" Apr 21 06:28:24.653493 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.653481 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 06:28:24.658422 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.658401 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gnrvf"] Apr 21 06:28:24.659781 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.659757 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4"] Apr 21 06:28:24.679895 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.679876 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-7vgfg"] Apr 21 06:28:24.683270 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.683251 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.685874 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.685854 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 06:28:24.685983 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.685897 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7m4pt\"" Apr 21 06:28:24.686302 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.686281 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 06:28:24.686400 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.686341 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 06:28:24.755420 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.755591 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d162ff51-8396-49a1-ad33-c6274571862d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.755591 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.755703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d162ff51-8396-49a1-ad33-c6274571862d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.755703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.755703 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755670 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnsff\" (UniqueName: \"kubernetes.io/projected/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-api-access-jnsff\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.755854 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755709 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d162ff51-8396-49a1-ad33-c6274571862d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.755854 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.755854 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwx7j\" (UniqueName: \"kubernetes.io/projected/d162ff51-8396-49a1-ad33-c6274571862d-kube-api-access-zwx7j\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.755987 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.755852 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 
06:28:24.856505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.856505 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d162ff51-8396-49a1-ad33-c6274571862d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18081e16-7ca9-4320-a6fc-7726f9939e49-metrics-client-ca\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d162ff51-8396-49a1-ad33-c6274571862d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsff\" (UniqueName: \"kubernetes.io/projected/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-api-access-jnsff\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856701 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-root\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.856734 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggvr\" (UniqueName: \"kubernetes.io/projected/18081e16-7ca9-4320-a6fc-7726f9939e49-kube-api-access-fggvr\") pod \"node-exporter-7vgfg\" (UID: 
\"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856762 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d162ff51-8396-49a1-ad33-c6274571862d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-tls\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwx7j\" (UniqueName: \"kubernetes.io/projected/d162ff51-8396-49a1-ad33-c6274571862d-kube-api-access-zwx7j\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856883 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-sys\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856968 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.856998 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-wtmp\") 
pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.857075 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.857025 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-textfile\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.858106 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.857365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.858106 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.857381 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.858106 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.857379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.858106 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:28:24.857965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d162ff51-8396-49a1-ad33-c6274571862d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.859693 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.859667 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.860177 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.860155 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.860290 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.860223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d162ff51-8396-49a1-ad33-c6274571862d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.860509 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.860491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d162ff51-8396-49a1-ad33-c6274571862d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.864845 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.864825 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwx7j\" (UniqueName: \"kubernetes.io/projected/d162ff51-8396-49a1-ad33-c6274571862d-kube-api-access-zwx7j\") pod \"openshift-state-metrics-9d44df66c-4f7q4\" (UID: \"d162ff51-8396-49a1-ad33-c6274571862d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.864940 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.864848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsff\" (UniqueName: \"kubernetes.io/projected/28abb5ca-b2e3-46aa-8fb4-9a304b05acd1-kube-api-access-jnsff\") pod \"kube-state-metrics-69db897b98-gnrvf\" (UID: \"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.958160 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-textfile\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958303 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18081e16-7ca9-4320-a6fc-7726f9939e49-metrics-client-ca\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 
06:28:24.958303 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958298 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-root\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958413 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fggvr\" (UniqueName: \"kubernetes.io/projected/18081e16-7ca9-4320-a6fc-7726f9939e49-kube-api-access-fggvr\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958413 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-tls\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958413 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958563 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-sys\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " 
pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958563 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958448 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-textfile\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958563 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958563 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-wtmp\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.958762 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-wtmp\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.959018 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-root\") pod \"node-exporter-7vgfg\" (UID: 
\"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.959018 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.958989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18081e16-7ca9-4320-a6fc-7726f9939e49-sys\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.959361 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.959335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-accelerators-collector-config\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.959446 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.959374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/18081e16-7ca9-4320-a6fc-7726f9939e49-metrics-client-ca\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.959681 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.959663 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" Apr 21 06:28:24.961388 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.961367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.961566 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.961539 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/18081e16-7ca9-4320-a6fc-7726f9939e49-node-exporter-tls\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.966923 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.966899 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" Apr 21 06:28:24.969074 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.969055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggvr\" (UniqueName: \"kubernetes.io/projected/18081e16-7ca9-4320-a6fc-7726f9939e49-kube-api-access-fggvr\") pod \"node-exporter-7vgfg\" (UID: \"18081e16-7ca9-4320-a6fc-7726f9939e49\") " pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:24.993013 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:24.992991 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7vgfg" Apr 21 06:28:25.121652 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:25.121616 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18081e16_7ca9_4320_a6fc_7726f9939e49.slice/crio-e9bfbbd2c14e88a13562ebc88cd22f60cefe25c13dc962b04cf8cfa80f3c0f9b WatchSource:0}: Error finding container e9bfbbd2c14e88a13562ebc88cd22f60cefe25c13dc962b04cf8cfa80f3c0f9b: Status 404 returned error can't find the container with id e9bfbbd2c14e88a13562ebc88cd22f60cefe25c13dc962b04cf8cfa80f3c0f9b Apr 21 06:28:25.245423 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.245399 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4"] Apr 21 06:28:25.249601 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:25.249567 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd162ff51_8396_49a1_ad33_c6274571862d.slice/crio-f1b3fa2623d43181282022b503c9ea19b184521c1c0ab3391f4195d1f8547a6c WatchSource:0}: Error finding container f1b3fa2623d43181282022b503c9ea19b184521c1c0ab3391f4195d1f8547a6c: Status 404 returned error can't find the container with id f1b3fa2623d43181282022b503c9ea19b184521c1c0ab3391f4195d1f8547a6c Apr 21 06:28:25.267442 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.267353 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-gnrvf"] Apr 21 06:28:25.269866 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:25.269841 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28abb5ca_b2e3_46aa_8fb4_9a304b05acd1.slice/crio-922acaf082bff4886f3dbf8cbeb088a3c49c3b30587cee4f7a997461f5d2d862 WatchSource:0}: Error finding container 
922acaf082bff4886f3dbf8cbeb088a3c49c3b30587cee4f7a997461f5d2d862: Status 404 returned error can't find the container with id 922acaf082bff4886f3dbf8cbeb088a3c49c3b30587cee4f7a997461f5d2d862 Apr 21 06:28:25.548760 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.548717 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-2f6rv" event={"ID":"75155df7-4d6d-4333-9af5-53bf8969c877","Type":"ContainerStarted","Data":"184ff7009d2af372bc1f148c74b5aeb3edfacd6f758341896117ed7a4b4ddc7e"} Apr 21 06:28:25.549251 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.549227 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:25.551507 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.551470 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" event={"ID":"d162ff51-8396-49a1-ad33-c6274571862d","Type":"ContainerStarted","Data":"2b7efaf481462844bebfcf16d0ddafa257a280cb5efba2215d95b8fc709de287"} Apr 21 06:28:25.551622 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.551535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" event={"ID":"d162ff51-8396-49a1-ad33-c6274571862d","Type":"ContainerStarted","Data":"88a76ab62362dc5050954aaae169f6422c1261f041bd12346c80ae076289f3d6"} Apr 21 06:28:25.551622 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.551551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" event={"ID":"d162ff51-8396-49a1-ad33-c6274571862d","Type":"ContainerStarted","Data":"f1b3fa2623d43181282022b503c9ea19b184521c1c0ab3391f4195d1f8547a6c"} Apr 21 06:28:25.552764 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.552726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vgfg" 
event={"ID":"18081e16-7ca9-4320-a6fc-7726f9939e49","Type":"ContainerStarted","Data":"e9bfbbd2c14e88a13562ebc88cd22f60cefe25c13dc962b04cf8cfa80f3c0f9b"} Apr 21 06:28:25.553889 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.553865 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" event={"ID":"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1","Type":"ContainerStarted","Data":"922acaf082bff4886f3dbf8cbeb088a3c49c3b30587cee4f7a997461f5d2d862"} Apr 21 06:28:25.564086 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.564067 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-2f6rv" Apr 21 06:28:25.568374 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:25.568334 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-2f6rv" podStartSLOduration=1.182694775 podStartE2EDuration="16.568318719s" podCreationTimestamp="2026-04-21 06:28:09 +0000 UTC" firstStartedPulling="2026-04-21 06:28:09.830482273 +0000 UTC m=+104.315133272" lastFinishedPulling="2026-04-21 06:28:25.2161062 +0000 UTC m=+119.700757216" observedRunningTime="2026-04-21 06:28:25.567881361 +0000 UTC m=+120.052532383" watchObservedRunningTime="2026-04-21 06:28:25.568318719 +0000 UTC m=+120.052969741" Apr 21 06:28:26.559909 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:26.559637 2571 generic.go:358] "Generic (PLEG): container finished" podID="18081e16-7ca9-4320-a6fc-7726f9939e49" containerID="2ccaadfc82e64d95aa3c74960cc4cc715216c2fcfe2bad4849704cf542485bba" exitCode=0 Apr 21 06:28:26.560460 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:26.560196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vgfg" event={"ID":"18081e16-7ca9-4320-a6fc-7726f9939e49","Type":"ContainerDied","Data":"2ccaadfc82e64d95aa3c74960cc4cc715216c2fcfe2bad4849704cf542485bba"} Apr 21 06:28:27.565368 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:28:27.565228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vgfg" event={"ID":"18081e16-7ca9-4320-a6fc-7726f9939e49","Type":"ContainerStarted","Data":"ca00dadfdaba08f4a9a92aa9e43d2d1ef36174ef37fd6af3ee71295c8e204e0c"} Apr 21 06:28:27.565368 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.565279 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7vgfg" event={"ID":"18081e16-7ca9-4320-a6fc-7726f9939e49","Type":"ContainerStarted","Data":"6fc2f7232aa32e905e8794025bac080b3d4f1ff355c252993a05f11f138efe93"} Apr 21 06:28:27.567601 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.567575 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" event={"ID":"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1","Type":"ContainerStarted","Data":"29184bfda5a6ecbd8a9238f46b557d55d5687fc33163beeeb46d30e113371fe8"} Apr 21 06:28:27.567745 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.567605 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" event={"ID":"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1","Type":"ContainerStarted","Data":"5d016e3344db70a24c4677cb6b8c5c96ee4a2e055167e282776e96e8ed028743"} Apr 21 06:28:27.567745 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.567618 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" event={"ID":"28abb5ca-b2e3-46aa-8fb4-9a304b05acd1","Type":"ContainerStarted","Data":"e2465e35f00fc61e3bb4e5d384d4a93add10aaba1d892bb42083d8da5e259904"} Apr 21 06:28:27.570044 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.569787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" 
event={"ID":"d162ff51-8396-49a1-ad33-c6274571862d","Type":"ContainerStarted","Data":"ce2be5645d474abf929d9d9f5eca7ae0be909af36374cadd640f62113b0563c2"} Apr 21 06:28:27.588555 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.588451 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7vgfg" podStartSLOduration=2.896345868 podStartE2EDuration="3.588435977s" podCreationTimestamp="2026-04-21 06:28:24 +0000 UTC" firstStartedPulling="2026-04-21 06:28:25.123548843 +0000 UTC m=+119.608199846" lastFinishedPulling="2026-04-21 06:28:25.815638938 +0000 UTC m=+120.300289955" observedRunningTime="2026-04-21 06:28:27.587311401 +0000 UTC m=+122.071962433" watchObservedRunningTime="2026-04-21 06:28:27.588435977 +0000 UTC m=+122.073087001" Apr 21 06:28:27.606697 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.606653 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-4f7q4" podStartSLOduration=2.133340214 podStartE2EDuration="3.606635962s" podCreationTimestamp="2026-04-21 06:28:24 +0000 UTC" firstStartedPulling="2026-04-21 06:28:25.374233838 +0000 UTC m=+119.858884851" lastFinishedPulling="2026-04-21 06:28:26.847529573 +0000 UTC m=+121.332180599" observedRunningTime="2026-04-21 06:28:27.605640712 +0000 UTC m=+122.090291735" watchObservedRunningTime="2026-04-21 06:28:27.606635962 +0000 UTC m=+122.091286987" Apr 21 06:28:27.630759 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:27.630709 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-gnrvf" podStartSLOduration=2.053634314 podStartE2EDuration="3.630694433s" podCreationTimestamp="2026-04-21 06:28:24 +0000 UTC" firstStartedPulling="2026-04-21 06:28:25.271698201 +0000 UTC m=+119.756349201" lastFinishedPulling="2026-04-21 06:28:26.848758313 +0000 UTC m=+121.333409320" observedRunningTime="2026-04-21 
06:28:27.629259811 +0000 UTC m=+122.113910846" watchObservedRunningTime="2026-04-21 06:28:27.630694433 +0000 UTC m=+122.115345489" Apr 21 06:28:29.052651 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.052613 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-754d4c66d8-rcb4z"] Apr 21 06:28:29.092356 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.092320 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-754d4c66d8-rcb4z"] Apr 21 06:28:29.092510 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.092473 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.095408 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.095385 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 06:28:29.095570 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.095411 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-4nk8m\"" Apr 21 06:28:29.095570 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.095382 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 06:28:29.095691 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.095382 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 06:28:29.096415 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.096393 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 06:28:29.096612 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.096590 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-server-lc44puo56s2j\"" Apr 21 06:28:29.195692 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-client-ca-bundle\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.195852 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-secret-metrics-server-tls\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.195852 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-audit-log\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.195852 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195797 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-secret-metrics-server-client-certs\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.195997 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195882 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjbl\" (UniqueName: \"kubernetes.io/projected/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-kube-api-access-nhjbl\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.195997 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.195997 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.195955 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-metrics-server-audit-profiles\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.296615 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-metrics-server-audit-profiles\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.296741 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-client-ca-bundle\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.296741 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-secret-metrics-server-tls\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.296741 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-audit-log\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.296863 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-secret-metrics-server-client-certs\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.296938 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjbl\" (UniqueName: \"kubernetes.io/projected/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-kube-api-access-nhjbl\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 
06:28:29.296986 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.296964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.297151 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.297104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-audit-log\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.297659 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.297613 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.297766 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.297691 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-metrics-server-audit-profiles\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.299787 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.299758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-secret-metrics-server-client-certs\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.299918 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.299893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-secret-metrics-server-tls\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.300005 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.299893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-client-ca-bundle\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.304598 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.304541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjbl\" (UniqueName: \"kubernetes.io/projected/ab9d28b9-df72-4ace-b58a-a72c2a76bb82-kube-api-access-nhjbl\") pod \"metrics-server-754d4c66d8-rcb4z\" (UID: \"ab9d28b9-df72-4ace-b58a-a72c2a76bb82\") " pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.403961 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.403932 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:29.540735 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.540693 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-754d4c66d8-rcb4z"] Apr 21 06:28:29.543509 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:29.543479 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab9d28b9_df72_4ace_b58a_a72c2a76bb82.slice/crio-28b2d55a1472d0fa75ae78331a424d4722d6ce45b0b6fdf05105b24a9fda437f WatchSource:0}: Error finding container 28b2d55a1472d0fa75ae78331a424d4722d6ce45b0b6fdf05105b24a9fda437f: Status 404 returned error can't find the container with id 28b2d55a1472d0fa75ae78331a424d4722d6ce45b0b6fdf05105b24a9fda437f Apr 21 06:28:29.576866 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:29.576792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" event={"ID":"ab9d28b9-df72-4ace-b58a-a72c2a76bb82","Type":"ContainerStarted","Data":"28b2d55a1472d0fa75ae78331a424d4722d6ce45b0b6fdf05105b24a9fda437f"} Apr 21 06:28:31.509292 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:31.509263 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-579f775496-zztsw" Apr 21 06:28:32.588243 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:32.588197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" event={"ID":"ab9d28b9-df72-4ace-b58a-a72c2a76bb82","Type":"ContainerStarted","Data":"229eb2a86eebedd657e85b64a23fbda8d188f5ff21b39097e64db229b3c199bb"} Apr 21 06:28:32.606708 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:32.606650 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" podStartSLOduration=1.643637671 
podStartE2EDuration="3.606633545s" podCreationTimestamp="2026-04-21 06:28:29 +0000 UTC" firstStartedPulling="2026-04-21 06:28:29.545677861 +0000 UTC m=+124.030328863" lastFinishedPulling="2026-04-21 06:28:31.508673719 +0000 UTC m=+125.993324737" observedRunningTime="2026-04-21 06:28:32.606206534 +0000 UTC m=+127.090857557" watchObservedRunningTime="2026-04-21 06:28:32.606633545 +0000 UTC m=+127.091284568" Apr 21 06:28:34.432067 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.432034 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b96dffcb7-np5j4"] Apr 21 06:28:34.457831 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.457801 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b96dffcb7-np5j4"] Apr 21 06:28:34.457981 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.457941 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.460719 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.460691 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 06:28:34.460860 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.460725 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 06:28:34.460860 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.460764 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 06:28:34.461724 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.461696 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 06:28:34.461941 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.461924 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-dockercfg-rk6ld\"" Apr 21 06:28:34.462037 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.461964 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 06:28:34.466880 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.466858 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 06:28:34.642433 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vpk6\" (UniqueName: \"kubernetes.io/projected/2b932e1c-4189-4911-a1c8-dc4956034b5d-kube-api-access-7vpk6\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.642621 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642436 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-oauth-serving-cert\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.642621 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642503 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-service-ca\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.642621 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-serving-cert\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.642621 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-trusted-ca-bundle\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.642621 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-config\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.642855 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.642659 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-oauth-config\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.743813 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.743733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-serving-cert\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.743813 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:28:34.743767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-trusted-ca-bundle\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.743813 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.743789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-config\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744061 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.743818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-oauth-config\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744061 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.743898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vpk6\" (UniqueName: \"kubernetes.io/projected/2b932e1c-4189-4911-a1c8-dc4956034b5d-kube-api-access-7vpk6\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744061 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.743929 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-oauth-serving-cert\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " 
pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744220 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.744190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-service-ca\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744660 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.744634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-oauth-serving-cert\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744792 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.744716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-trusted-ca-bundle\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744792 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.744734 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-config\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.744936 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.744913 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-service-ca\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") 
" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.746707 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.746682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-oauth-config\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.746952 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.746932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-serving-cert\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.752760 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.752739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vpk6\" (UniqueName: \"kubernetes.io/projected/2b932e1c-4189-4911-a1c8-dc4956034b5d-kube-api-access-7vpk6\") pod \"console-5b96dffcb7-np5j4\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.769489 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.769465 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:34.845834 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.845760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:28:34.848022 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.847974 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/546de538-b56a-4ad2-baeb-3d59144586fb-metrics-certs\") pod \"network-metrics-daemon-qjchj\" (UID: \"546de538-b56a-4ad2-baeb-3d59144586fb\") " pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:28:34.905845 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:34.905817 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b96dffcb7-np5j4"] Apr 21 06:28:34.909079 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:34.909049 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b932e1c_4189_4911_a1c8_dc4956034b5d.slice/crio-b2547de9c727175d8366666b43c646d2e355ef4ca9cf1bd1b4f54b064a415db2 WatchSource:0}: Error finding container b2547de9c727175d8366666b43c646d2e355ef4ca9cf1bd1b4f54b064a415db2: Status 404 returned error can't find the container with id b2547de9c727175d8366666b43c646d2e355ef4ca9cf1bd1b4f54b064a415db2 Apr 21 06:28:35.035176 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:35.035154 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p6jr2\"" Apr 21 06:28:35.043160 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:35.043143 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qjchj" Apr 21 06:28:35.156404 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:35.156373 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qjchj"] Apr 21 06:28:35.159145 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:28:35.159118 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod546de538_b56a_4ad2_baeb_3d59144586fb.slice/crio-fe1fbf6d7e577963f47fdce9cb17ff872308116aa1bb4309d4af0567815d6530 WatchSource:0}: Error finding container fe1fbf6d7e577963f47fdce9cb17ff872308116aa1bb4309d4af0567815d6530: Status 404 returned error can't find the container with id fe1fbf6d7e577963f47fdce9cb17ff872308116aa1bb4309d4af0567815d6530 Apr 21 06:28:35.599095 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:35.599059 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qjchj" event={"ID":"546de538-b56a-4ad2-baeb-3d59144586fb","Type":"ContainerStarted","Data":"fe1fbf6d7e577963f47fdce9cb17ff872308116aa1bb4309d4af0567815d6530"} Apr 21 06:28:35.600258 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:35.600222 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b96dffcb7-np5j4" event={"ID":"2b932e1c-4189-4911-a1c8-dc4956034b5d","Type":"ContainerStarted","Data":"b2547de9c727175d8366666b43c646d2e355ef4ca9cf1bd1b4f54b064a415db2"} Apr 21 06:28:38.612632 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:38.612551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qjchj" event={"ID":"546de538-b56a-4ad2-baeb-3d59144586fb","Type":"ContainerStarted","Data":"d6bc78608344f0b8123e3e773611ee0e2baf09c8eaa03735a7b9d29a3bd4395b"} Apr 21 06:28:38.612632 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:38.612589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-qjchj" event={"ID":"546de538-b56a-4ad2-baeb-3d59144586fb","Type":"ContainerStarted","Data":"1bf608bdbed8311c7b2835ea8a08c8f19ae0a5cdb259c715ae438b4fad20ee21"} Apr 21 06:28:38.613995 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:38.613965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b96dffcb7-np5j4" event={"ID":"2b932e1c-4189-4911-a1c8-dc4956034b5d","Type":"ContainerStarted","Data":"f6a6d32b33822ff6ed501cda807efd612c2061393714c99403bca78035e9a5c8"} Apr 21 06:28:38.628676 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:38.628633 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qjchj" podStartSLOduration=129.869684108 podStartE2EDuration="2m12.628620158s" podCreationTimestamp="2026-04-21 06:26:26 +0000 UTC" firstStartedPulling="2026-04-21 06:28:35.160913552 +0000 UTC m=+129.645564553" lastFinishedPulling="2026-04-21 06:28:37.919849602 +0000 UTC m=+132.404500603" observedRunningTime="2026-04-21 06:28:38.627128295 +0000 UTC m=+133.111779309" watchObservedRunningTime="2026-04-21 06:28:38.628620158 +0000 UTC m=+133.113271179" Apr 21 06:28:38.645102 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:38.645054 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b96dffcb7-np5j4" podStartSLOduration=1.284140329 podStartE2EDuration="4.64501027s" podCreationTimestamp="2026-04-21 06:28:34 +0000 UTC" firstStartedPulling="2026-04-21 06:28:34.911464874 +0000 UTC m=+129.396115877" lastFinishedPulling="2026-04-21 06:28:38.272334815 +0000 UTC m=+132.756985818" observedRunningTime="2026-04-21 06:28:38.644118901 +0000 UTC m=+133.128769926" watchObservedRunningTime="2026-04-21 06:28:38.64501027 +0000 UTC m=+133.129661293" Apr 21 06:28:44.770612 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:44.770576 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:44.771078 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:44.770628 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:44.774924 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:44.774901 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:45.636630 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:45.636600 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:28:49.404986 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:49.404953 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:49.405366 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:49.405046 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:28:57.665842 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:57.665812 2571 generic.go:358] "Generic (PLEG): container finished" podID="19694577-e921-48dd-acfc-d48492e5ee03" containerID="e9d0f62a6fdba6e69d79a134c5bd6de541e0de3f5689ea0901c1dd808599d600" exitCode=0 Apr 21 06:28:57.666205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:57.665889 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh" event={"ID":"19694577-e921-48dd-acfc-d48492e5ee03","Type":"ContainerDied","Data":"e9d0f62a6fdba6e69d79a134c5bd6de541e0de3f5689ea0901c1dd808599d600"} Apr 21 06:28:57.666205 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:28:57.666169 2571 scope.go:117] "RemoveContainer" containerID="e9d0f62a6fdba6e69d79a134c5bd6de541e0de3f5689ea0901c1dd808599d600" Apr 21 06:28:58.671205 ip-10-0-138-76 kubenswrapper[2571]: I0421 
06:28:58.671169 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-s74rh" event={"ID":"19694577-e921-48dd-acfc-d48492e5ee03","Type":"ContainerStarted","Data":"afd63b48135bcdbf2d92611764c33a4068cca7bb06062a85105c34b9d2a07c9f"} Apr 21 06:29:03.117196 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:29:03.117162 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mdnlh_4aa4526d-b9ab-4ef8-9333-2a849fe6acc7/serve-healthcheck-canary/0.log" Apr 21 06:29:09.409870 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:29:09.409788 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:29:09.413841 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:29:09.413818 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-754d4c66d8-rcb4z" Apr 21 06:29:58.730643 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:29:58.730600 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b96dffcb7-np5j4"] Apr 21 06:30:23.754437 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:23.754377 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b96dffcb7-np5j4" podUID="2b932e1c-4189-4911-a1c8-dc4956034b5d" containerName="console" containerID="cri-o://f6a6d32b33822ff6ed501cda807efd612c2061393714c99403bca78035e9a5c8" gracePeriod=15 Apr 21 06:30:23.913330 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:23.913305 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b96dffcb7-np5j4_2b932e1c-4189-4911-a1c8-dc4956034b5d/console/0.log" Apr 21 06:30:23.913475 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:23.913343 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b932e1c-4189-4911-a1c8-dc4956034b5d" 
containerID="f6a6d32b33822ff6ed501cda807efd612c2061393714c99403bca78035e9a5c8" exitCode=2 Apr 21 06:30:23.913475 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:23.913377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b96dffcb7-np5j4" event={"ID":"2b932e1c-4189-4911-a1c8-dc4956034b5d","Type":"ContainerDied","Data":"f6a6d32b33822ff6ed501cda807efd612c2061393714c99403bca78035e9a5c8"} Apr 21 06:30:24.011119 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.011064 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b96dffcb7-np5j4_2b932e1c-4189-4911-a1c8-dc4956034b5d/console/0.log" Apr 21 06:30:24.011224 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.011147 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b96dffcb7-np5j4" Apr 21 06:30:24.102741 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102712 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-trusted-ca-bundle\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 06:30:24.102881 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102759 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-service-ca\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 06:30:24.102881 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102798 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-config\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 
06:30:24.102881 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102821 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-oauth-config\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 06:30:24.102881 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102841 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-serving-cert\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 06:30:24.102881 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102857 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vpk6\" (UniqueName: \"kubernetes.io/projected/2b932e1c-4189-4911-a1c8-dc4956034b5d-kube-api-access-7vpk6\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 06:30:24.103123 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.102962 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-oauth-serving-cert\") pod \"2b932e1c-4189-4911-a1c8-dc4956034b5d\" (UID: \"2b932e1c-4189-4911-a1c8-dc4956034b5d\") " Apr 21 06:30:24.103178 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.103143 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:30:24.103279 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.103253 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-config" (OuterVolumeSpecName: "console-config") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:30:24.103346 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.103296 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-service-ca" (OuterVolumeSpecName: "service-ca") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:30:24.103560 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.103511 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 06:30:24.105114 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.105092 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:30:24.105372 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.105350 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 06:30:24.105442 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.105378 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b932e1c-4189-4911-a1c8-dc4956034b5d-kube-api-access-7vpk6" (OuterVolumeSpecName: "kube-api-access-7vpk6") pod "2b932e1c-4189-4911-a1c8-dc4956034b5d" (UID: "2b932e1c-4189-4911-a1c8-dc4956034b5d"). InnerVolumeSpecName "kube-api-access-7vpk6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 06:30:24.204001 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.203978 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-config\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.204001 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.204000 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-oauth-config\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.204130 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.204010 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b932e1c-4189-4911-a1c8-dc4956034b5d-console-serving-cert\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.204130 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:30:24.204021 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vpk6\" (UniqueName: \"kubernetes.io/projected/2b932e1c-4189-4911-a1c8-dc4956034b5d-kube-api-access-7vpk6\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.204130 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.204030 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-oauth-serving-cert\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.204130 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.204038 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-trusted-ca-bundle\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.204130 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.204048 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2b932e1c-4189-4911-a1c8-dc4956034b5d-service-ca\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\"" Apr 21 06:30:24.917451 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.917425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b96dffcb7-np5j4_2b932e1c-4189-4911-a1c8-dc4956034b5d/console/0.log" Apr 21 06:30:24.917819 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.917511 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b96dffcb7-np5j4" event={"ID":"2b932e1c-4189-4911-a1c8-dc4956034b5d","Type":"ContainerDied","Data":"b2547de9c727175d8366666b43c646d2e355ef4ca9cf1bd1b4f54b064a415db2"} Apr 21 06:30:24.917819 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.917550 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5b96dffcb7-np5j4"
Apr 21 06:30:24.917819 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.917560 2571 scope.go:117] "RemoveContainer" containerID="f6a6d32b33822ff6ed501cda807efd612c2061393714c99403bca78035e9a5c8"
Apr 21 06:30:24.937569 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.934550 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b96dffcb7-np5j4"]
Apr 21 06:30:24.940333 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:24.940312 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b96dffcb7-np5j4"]
Apr 21 06:30:26.114976 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:30:26.114945 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b932e1c-4189-4911-a1c8-dc4956034b5d" path="/var/lib/kubelet/pods/2b932e1c-4189-4911-a1c8-dc4956034b5d/volumes"
Apr 21 06:31:25.996086 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:25.996053 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log"
Apr 21 06:31:25.996652 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:25.996063 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log"
Apr 21 06:31:26.003243 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:26.003222 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 06:31:55.689597 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.689559 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-9nm7q"]
Apr 21 06:31:55.692124 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.689845 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b932e1c-4189-4911-a1c8-dc4956034b5d" containerName="console"
Apr 21 06:31:55.692124 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.689856 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b932e1c-4189-4911-a1c8-dc4956034b5d" containerName="console"
Apr 21 06:31:55.692124 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.689919 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b932e1c-4189-4911-a1c8-dc4956034b5d" containerName="console"
Apr 21 06:31:55.692919 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.692901 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:55.695259 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.695234 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-cvmfh\""
Apr 21 06:31:55.695361 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.695285 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 06:31:55.696235 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.696211 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 06:31:55.702027 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.702006 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-9nm7q"]
Apr 21 06:31:55.829723 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.829694 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fec195f1-0288-447a-91b9-ef247c1b29de-bound-sa-token\") pod \"cert-manager-79c8d999ff-9nm7q\" (UID: \"fec195f1-0288-447a-91b9-ef247c1b29de\") " pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:55.829835 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.829742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szfj5\" (UniqueName: \"kubernetes.io/projected/fec195f1-0288-447a-91b9-ef247c1b29de-kube-api-access-szfj5\") pod \"cert-manager-79c8d999ff-9nm7q\" (UID: \"fec195f1-0288-447a-91b9-ef247c1b29de\") " pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:55.930857 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.930833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szfj5\" (UniqueName: \"kubernetes.io/projected/fec195f1-0288-447a-91b9-ef247c1b29de-kube-api-access-szfj5\") pod \"cert-manager-79c8d999ff-9nm7q\" (UID: \"fec195f1-0288-447a-91b9-ef247c1b29de\") " pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:55.930944 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.930887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fec195f1-0288-447a-91b9-ef247c1b29de-bound-sa-token\") pod \"cert-manager-79c8d999ff-9nm7q\" (UID: \"fec195f1-0288-447a-91b9-ef247c1b29de\") " pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:55.942213 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.942162 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fec195f1-0288-447a-91b9-ef247c1b29de-bound-sa-token\") pod \"cert-manager-79c8d999ff-9nm7q\" (UID: \"fec195f1-0288-447a-91b9-ef247c1b29de\") " pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:55.942445 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:55.942423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szfj5\" (UniqueName: \"kubernetes.io/projected/fec195f1-0288-447a-91b9-ef247c1b29de-kube-api-access-szfj5\") pod \"cert-manager-79c8d999ff-9nm7q\" (UID: \"fec195f1-0288-447a-91b9-ef247c1b29de\") " pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:56.003339 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:56.003309 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-9nm7q"
Apr 21 06:31:56.119169 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:56.119114 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-9nm7q"]
Apr 21 06:31:56.121765 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:31:56.121738 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfec195f1_0288_447a_91b9_ef247c1b29de.slice/crio-94b607c65a276a1fed7b589465b2bda54fae2735aaff7301a1de63b49b6e7316 WatchSource:0}: Error finding container 94b607c65a276a1fed7b589465b2bda54fae2735aaff7301a1de63b49b6e7316: Status 404 returned error can't find the container with id 94b607c65a276a1fed7b589465b2bda54fae2735aaff7301a1de63b49b6e7316
Apr 21 06:31:56.123412 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:56.123395 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 06:31:56.164450 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:56.164423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-9nm7q" event={"ID":"fec195f1-0288-447a-91b9-ef247c1b29de","Type":"ContainerStarted","Data":"94b607c65a276a1fed7b589465b2bda54fae2735aaff7301a1de63b49b6e7316"}
Apr 21 06:31:59.174602 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:59.174506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-9nm7q" event={"ID":"fec195f1-0288-447a-91b9-ef247c1b29de","Type":"ContainerStarted","Data":"08e059c3620c593273289c40ec2e162a88e8493c6918551920da6e737c560b8c"}
Apr 21 06:31:59.188980 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:31:59.188931 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-9nm7q" podStartSLOduration=1.46477085 podStartE2EDuration="4.188914003s" podCreationTimestamp="2026-04-21 06:31:55 +0000 UTC" firstStartedPulling="2026-04-21 06:31:56.123533676 +0000 UTC m=+330.608184677" lastFinishedPulling="2026-04-21 06:31:58.847676825 +0000 UTC m=+333.332327830" observedRunningTime="2026-04-21 06:31:59.188078566 +0000 UTC m=+333.672729598" watchObservedRunningTime="2026-04-21 06:31:59.188914003 +0000 UTC m=+333.673565026"
Apr 21 06:32:11.660745 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.660657 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"]
Apr 21 06:32:11.663666 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.663639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.666128 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.666098 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 21 06:32:11.667178 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.667158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-tpvkp\""
Apr 21 06:32:11.667250 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.667194 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 21 06:32:11.672506 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.672486 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"]
Apr 21 06:32:11.746845 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.746818 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b271c541-e1a2-4f7d-876a-2cc8c9630482-tmp\") pod \"jobset-operator-747c5859c7-tpxsq\" (UID: \"b271c541-e1a2-4f7d-876a-2cc8c9630482\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.746937 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.746853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42rlc\" (UniqueName: \"kubernetes.io/projected/b271c541-e1a2-4f7d-876a-2cc8c9630482-kube-api-access-42rlc\") pod \"jobset-operator-747c5859c7-tpxsq\" (UID: \"b271c541-e1a2-4f7d-876a-2cc8c9630482\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.847397 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.847361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b271c541-e1a2-4f7d-876a-2cc8c9630482-tmp\") pod \"jobset-operator-747c5859c7-tpxsq\" (UID: \"b271c541-e1a2-4f7d-876a-2cc8c9630482\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.847397 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.847393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42rlc\" (UniqueName: \"kubernetes.io/projected/b271c541-e1a2-4f7d-876a-2cc8c9630482-kube-api-access-42rlc\") pod \"jobset-operator-747c5859c7-tpxsq\" (UID: \"b271c541-e1a2-4f7d-876a-2cc8c9630482\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.847737 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.847721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b271c541-e1a2-4f7d-876a-2cc8c9630482-tmp\") pod \"jobset-operator-747c5859c7-tpxsq\" (UID: \"b271c541-e1a2-4f7d-876a-2cc8c9630482\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.855682 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.855662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42rlc\" (UniqueName: \"kubernetes.io/projected/b271c541-e1a2-4f7d-876a-2cc8c9630482-kube-api-access-42rlc\") pod \"jobset-operator-747c5859c7-tpxsq\" (UID: \"b271c541-e1a2-4f7d-876a-2cc8c9630482\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:11.973311 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:11.973255 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"
Apr 21 06:32:12.117260 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:32:12.116956 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb271c541_e1a2_4f7d_876a_2cc8c9630482.slice/crio-1672c1b77986d771f638cd80d95e51865f1020695e02bdc512ac1ee18fdecee1 WatchSource:0}: Error finding container 1672c1b77986d771f638cd80d95e51865f1020695e02bdc512ac1ee18fdecee1: Status 404 returned error can't find the container with id 1672c1b77986d771f638cd80d95e51865f1020695e02bdc512ac1ee18fdecee1
Apr 21 06:32:12.117874 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:12.117843 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq"]
Apr 21 06:32:12.214817 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:12.214777 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq" event={"ID":"b271c541-e1a2-4f7d-876a-2cc8c9630482","Type":"ContainerStarted","Data":"1672c1b77986d771f638cd80d95e51865f1020695e02bdc512ac1ee18fdecee1"}
Apr 21 06:32:15.225206 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:15.225162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq" event={"ID":"b271c541-e1a2-4f7d-876a-2cc8c9630482","Type":"ContainerStarted","Data":"c60ce4ecd619b5f38dd5c2283f97d95c9dee47a5c122d21f8998f10418b763d9"}
Apr 21 06:32:15.240252 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:15.240207 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-tpxsq" podStartSLOduration=1.65613459 podStartE2EDuration="4.240193335s" podCreationTimestamp="2026-04-21 06:32:11 +0000 UTC" firstStartedPulling="2026-04-21 06:32:12.118351193 +0000 UTC m=+346.603002196" lastFinishedPulling="2026-04-21 06:32:14.702409941 +0000 UTC m=+349.187060941" observedRunningTime="2026-04-21 06:32:15.239020974 +0000 UTC m=+349.723671997" watchObservedRunningTime="2026-04-21 06:32:15.240193335 +0000 UTC m=+349.724844368"
Apr 21 06:32:26.916391 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.916360 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"]
Apr 21 06:32:26.919430 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.919414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:26.922729 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.922696 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 21 06:32:26.922865 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.922729 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 21 06:32:26.922865 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.922707 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-b4mwv\""
Apr 21 06:32:26.922865 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.922826 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 21 06:32:26.926810 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.926790 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"]
Apr 21 06:32:26.944234 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.944215 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-metrics-certs\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:26.944348 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.944297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-manager-config\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:26.944348 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.944331 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-cert\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:26.944422 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:26.944411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w448\" (UniqueName: \"kubernetes.io/projected/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-kube-api-access-2w448\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.045599 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.045558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-cert\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.045790 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.045689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w448\" (UniqueName: \"kubernetes.io/projected/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-kube-api-access-2w448\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.045790 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.045736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-metrics-certs\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.045920 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.045789 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-manager-config\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.046743 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.046709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-manager-config\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.048722 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.048694 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-cert\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.048952 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.048921 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-metrics-certs\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.053928 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.053907 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w448\" (UniqueName: \"kubernetes.io/projected/d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c-kube-api-access-2w448\") pod \"jobset-controller-manager-5cc8ccb8b7-jbp4b\" (UID: \"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c\") " pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.229649 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.229569 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:27.346053 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:27.345947 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"]
Apr 21 06:32:27.348190 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:32:27.348157 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78c09bc_d1a1_4d6e_93a9_46643fb3fd5c.slice/crio-5fa920d98862004c5a31fc7ee41dde6de8fec46b2857a24f3b8cef656b2078cd WatchSource:0}: Error finding container 5fa920d98862004c5a31fc7ee41dde6de8fec46b2857a24f3b8cef656b2078cd: Status 404 returned error can't find the container with id 5fa920d98862004c5a31fc7ee41dde6de8fec46b2857a24f3b8cef656b2078cd
Apr 21 06:32:28.261851 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:28.261817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b" event={"ID":"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c","Type":"ContainerStarted","Data":"5fa920d98862004c5a31fc7ee41dde6de8fec46b2857a24f3b8cef656b2078cd"}
Apr 21 06:32:43.305419 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:43.305383 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b" event={"ID":"d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c","Type":"ContainerStarted","Data":"7f5d2795e8f7bcba7b53f2be9740da6ca996d40a8bee4da2db3060d92c0f575a"}
Apr 21 06:32:43.305814 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:43.305485 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:32:43.321321 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:43.321278 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b" podStartSLOduration=2.19875241 podStartE2EDuration="17.321266835s" podCreationTimestamp="2026-04-21 06:32:26 +0000 UTC" firstStartedPulling="2026-04-21 06:32:27.350265829 +0000 UTC m=+361.834916836" lastFinishedPulling="2026-04-21 06:32:42.472780257 +0000 UTC m=+376.957431261" observedRunningTime="2026-04-21 06:32:43.319366092 +0000 UTC m=+377.804017115" watchObservedRunningTime="2026-04-21 06:32:43.321266835 +0000 UTC m=+377.805917856"
Apr 21 06:32:54.314261 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:32:54.314226 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-5cc8ccb8b7-jbp4b"
Apr 21 06:36:26.019267 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:36:26.019236 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log"
Apr 21 06:36:26.019833 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:36:26.019351 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log"
Apr 21 06:37:55.960935 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:55.960901 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"]
Apr 21 06:37:55.962935 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:55.962920 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:37:55.965144 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:55.965105 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"openshift-service-ca.crt\""
Apr 21 06:37:55.965144 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:55.965120 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"kube-root-ca.crt\""
Apr 21 06:37:55.965302 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:55.965144 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fqc9p\"/\"default-dockercfg-nvv9m\""
Apr 21 06:37:55.972714 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:55.972687 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"]
Apr 21 06:37:56.118052 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:56.118011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smb79\" (UniqueName: \"kubernetes.io/projected/c717c5d2-6c46-4a9a-bcde-be96fcae9bff-kube-api-access-smb79\") pod \"progression-job-failure-node-0-0-lbqvb\" (UID: \"c717c5d2-6c46-4a9a-bcde-be96fcae9bff\") " pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:37:56.219442 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:56.219370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smb79\" (UniqueName: \"kubernetes.io/projected/c717c5d2-6c46-4a9a-bcde-be96fcae9bff-kube-api-access-smb79\") pod \"progression-job-failure-node-0-0-lbqvb\" (UID: \"c717c5d2-6c46-4a9a-bcde-be96fcae9bff\") " pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:37:56.227408 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:56.227389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smb79\" (UniqueName: \"kubernetes.io/projected/c717c5d2-6c46-4a9a-bcde-be96fcae9bff-kube-api-access-smb79\") pod \"progression-job-failure-node-0-0-lbqvb\" (UID: \"c717c5d2-6c46-4a9a-bcde-be96fcae9bff\") " pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:37:56.272742 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:56.272719 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:37:56.387297 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:56.387269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"]
Apr 21 06:37:56.389888 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:37:56.389859 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc717c5d2_6c46_4a9a_bcde_be96fcae9bff.slice/crio-7299de73bc668be7c2585357de57757e7ad17111e9679b4855c21d5e679a9bb3 WatchSource:0}: Error finding container 7299de73bc668be7c2585357de57757e7ad17111e9679b4855c21d5e679a9bb3: Status 404 returned error can't find the container with id 7299de73bc668be7c2585357de57757e7ad17111e9679b4855c21d5e679a9bb3
Apr 21 06:37:56.391971 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:56.391954 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 06:37:57.183533 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:37:57.183473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb" event={"ID":"c717c5d2-6c46-4a9a-bcde-be96fcae9bff","Type":"ContainerStarted","Data":"7299de73bc668be7c2585357de57757e7ad17111e9679b4855c21d5e679a9bb3"}
Apr 21 06:39:43.510649 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:43.510560 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb" event={"ID":"c717c5d2-6c46-4a9a-bcde-be96fcae9bff","Type":"ContainerStarted","Data":"4fe72c2218148577dd3711a82da35a5cccdf27ab36a29a4d83c25cb4d19915b0"}
Apr 21 06:39:43.511063 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:43.510648 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:39:43.530104 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:43.529976 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb" podStartSLOduration=1.774905418 podStartE2EDuration="1m48.529959668s" podCreationTimestamp="2026-04-21 06:37:55 +0000 UTC" firstStartedPulling="2026-04-21 06:37:56.392076787 +0000 UTC m=+690.876727787" lastFinishedPulling="2026-04-21 06:39:43.147131037 +0000 UTC m=+797.631782037" observedRunningTime="2026-04-21 06:39:43.5294287 +0000 UTC m=+798.014079719" watchObservedRunningTime="2026-04-21 06:39:43.529959668 +0000 UTC m=+798.014610696"
Apr 21 06:39:45.516046 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:45.516021 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:39:52.513834 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:52.513772 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb" podUID="c717c5d2-6c46-4a9a-bcde-be96fcae9bff" containerName="node" probeResult="failure" output="Get \"http://10.133.0.23:28080/metrics\": dial tcp 10.133.0.23:28080: connect: connection refused"
Apr 21 06:39:52.537578 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:52.537545 2571 generic.go:358] "Generic (PLEG): container finished" podID="c717c5d2-6c46-4a9a-bcde-be96fcae9bff" containerID="4fe72c2218148577dd3711a82da35a5cccdf27ab36a29a4d83c25cb4d19915b0" exitCode=1
Apr 21 06:39:52.537689 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:52.537615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb" event={"ID":"c717c5d2-6c46-4a9a-bcde-be96fcae9bff","Type":"ContainerDied","Data":"4fe72c2218148577dd3711a82da35a5cccdf27ab36a29a4d83c25cb4d19915b0"}
Apr 21 06:39:53.662447 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:53.662427 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:39:53.743159 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:53.743129 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smb79\" (UniqueName: \"kubernetes.io/projected/c717c5d2-6c46-4a9a-bcde-be96fcae9bff-kube-api-access-smb79\") pod \"c717c5d2-6c46-4a9a-bcde-be96fcae9bff\" (UID: \"c717c5d2-6c46-4a9a-bcde-be96fcae9bff\") "
Apr 21 06:39:53.745222 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:53.745190 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c717c5d2-6c46-4a9a-bcde-be96fcae9bff-kube-api-access-smb79" (OuterVolumeSpecName: "kube-api-access-smb79") pod "c717c5d2-6c46-4a9a-bcde-be96fcae9bff" (UID: "c717c5d2-6c46-4a9a-bcde-be96fcae9bff"). InnerVolumeSpecName "kube-api-access-smb79". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 06:39:53.843598 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:53.843548 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-smb79\" (UniqueName: \"kubernetes.io/projected/c717c5d2-6c46-4a9a-bcde-be96fcae9bff-kube-api-access-smb79\") on node \"ip-10-0-138-76.ec2.internal\" DevicePath \"\""
Apr 21 06:39:54.543965 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:54.543930 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"
Apr 21 06:39:54.544127 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:54.543924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb" event={"ID":"c717c5d2-6c46-4a9a-bcde-be96fcae9bff","Type":"ContainerDied","Data":"7299de73bc668be7c2585357de57757e7ad17111e9679b4855c21d5e679a9bb3"}
Apr 21 06:39:54.544127 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:39:54.544047 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7299de73bc668be7c2585357de57757e7ad17111e9679b4855c21d5e679a9bb3"
Apr 21 06:40:12.991859 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:40:12.991818 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"]
Apr 21 06:40:12.997327 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:40:12.997301 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fqc9p/progression-job-failure-node-0-0-lbqvb"]
Apr 21 06:40:14.114553 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:40:14.114494 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c717c5d2-6c46-4a9a-bcde-be96fcae9bff" path="/var/lib/kubelet/pods/c717c5d2-6c46-4a9a-bcde-be96fcae9bff/volumes"
Apr 21 06:41:03.567203 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:03.567167 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rlzb5_75cb84e7-4602-4954-9579-ec59fa9a8289/global-pull-secret-syncer/0.log"
Apr 21 06:41:03.571577 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:03.571560 2571 ???:1] "http: TLS handshake error from 10.0.129.55:40938: EOF"
Apr 21 06:41:03.745430 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:03.745401 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-kf6hb_1acddd86-ee36-4689-b8ab-ef158e2b4a47/konnectivity-agent/0.log"
Apr 21 06:41:03.838006 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:03.837920 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-76.ec2.internal_131d5a43792cacf9a9c03a2052451cbd/haproxy/0.log"
Apr 21 06:41:06.859955 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:06.859843 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gnrvf_28abb5ca-b2e3-46aa-8fb4-9a304b05acd1/kube-state-metrics/0.log"
Apr 21 06:41:06.883173 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:06.883148 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gnrvf_28abb5ca-b2e3-46aa-8fb4-9a304b05acd1/kube-rbac-proxy-main/0.log"
Apr 21 06:41:06.902550 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:06.902481 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-gnrvf_28abb5ca-b2e3-46aa-8fb4-9a304b05acd1/kube-rbac-proxy-self/0.log"
Apr 21 06:41:06.930746 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:06.930719 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-754d4c66d8-rcb4z_ab9d28b9-df72-4ace-b58a-a72c2a76bb82/metrics-server/0.log"
Apr 21 06:41:06.996211 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:06.996190 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vgfg_18081e16-7ca9-4320-a6fc-7726f9939e49/node-exporter/0.log"
Apr 21 06:41:07.015270 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:07.015253 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vgfg_18081e16-7ca9-4320-a6fc-7726f9939e49/kube-rbac-proxy/0.log"
Apr 21 06:41:07.036067 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:07.036046 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7vgfg_18081e16-7ca9-4320-a6fc-7726f9939e49/init-textfile/0.log"
Apr 21 06:41:07.216056 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:07.215996 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4f7q4_d162ff51-8396-49a1-ad33-c6274571862d/kube-rbac-proxy-main/0.log"
Apr 21 06:41:07.234280 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:07.234253 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4f7q4_d162ff51-8396-49a1-ad33-c6274571862d/kube-rbac-proxy-self/0.log"
Apr 21 06:41:07.255292 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:07.255269 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-4f7q4_d162ff51-8396-49a1-ad33-c6274571862d/openshift-state-metrics/0.log"
Apr 21 06:41:09.642677 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:09.642649 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-2f6rv_75155df7-4d6d-4333-9af5-53bf8969c877/download-server/0.log"
Apr 21 06:41:10.567200 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.567169 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44"]
Apr 21 06:41:10.567566 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.567537 2571 cpu_manager.go:401] "RemoveStaleState:
containerMap: removing container" podUID="c717c5d2-6c46-4a9a-bcde-be96fcae9bff" containerName="node" Apr 21 06:41:10.567566 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.567557 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c717c5d2-6c46-4a9a-bcde-be96fcae9bff" containerName="node" Apr 21 06:41:10.567725 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.567642 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c717c5d2-6c46-4a9a-bcde-be96fcae9bff" containerName="node" Apr 21 06:41:10.570270 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.570248 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.572721 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.572703 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lwwlk\"/\"openshift-service-ca.crt\"" Apr 21 06:41:10.572823 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.572741 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lwwlk\"/\"kube-root-ca.crt\"" Apr 21 06:41:10.573486 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.573470 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lwwlk\"/\"default-dockercfg-2tfc9\"" Apr 21 06:41:10.581196 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.581173 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44"] Apr 21 06:41:10.668434 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.668408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwd4t\" (UniqueName: \"kubernetes.io/projected/20e2b4b5-f029-47af-873c-b8c2465a40f9-kube-api-access-vwd4t\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " 
pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.668750 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.668445 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-podres\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.668750 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.668472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-lib-modules\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.668750 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.668531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-sys\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.668750 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.668562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-proc\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.675036 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.675015 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-t2hf9_f6d75f91-6fd1-4d46-9c53-7c8492b33064/dns/0.log" Apr 21 06:41:10.692504 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.692484 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t2hf9_f6d75f91-6fd1-4d46-9c53-7c8492b33064/kube-rbac-proxy/0.log" Apr 21 06:41:10.769038 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwd4t\" (UniqueName: \"kubernetes.io/projected/20e2b4b5-f029-47af-873c-b8c2465a40f9-kube-api-access-vwd4t\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769114 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-podres\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769114 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-lib-modules\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769114 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-sys\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " 
pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769233 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-proc\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769233 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-podres\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769233 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-proc\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769233 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-lib-modules\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.769348 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.769235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e2b4b5-f029-47af-873c-b8c2465a40f9-sys\") pod \"perf-node-gather-daemonset-xhl44\" 
(UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.778762 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.778739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwd4t\" (UniqueName: \"kubernetes.io/projected/20e2b4b5-f029-47af-873c-b8c2465a40f9-kube-api-access-vwd4t\") pod \"perf-node-gather-daemonset-xhl44\" (UID: \"20e2b4b5-f029-47af-873c-b8c2465a40f9\") " pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.801216 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.801197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9x6w4_a90e0df9-bd86-4e43-ab44-ddd45d0f1a43/dns-node-resolver/0.log" Apr 21 06:41:10.881450 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.881399 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:10.996135 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:10.996107 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44"] Apr 21 06:41:10.999413 ip-10-0-138-76 kubenswrapper[2571]: W0421 06:41:10.999384 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod20e2b4b5_f029_47af_873c_b8c2465a40f9.slice/crio-97ae2b41ba0b5708b0de38952d09283685788b0cd6a06dde97a662a6d1b785a8 WatchSource:0}: Error finding container 97ae2b41ba0b5708b0de38952d09283685788b0cd6a06dde97a662a6d1b785a8: Status 404 returned error can't find the container with id 97ae2b41ba0b5708b0de38952d09283685788b0cd6a06dde97a662a6d1b785a8 Apr 21 06:41:11.189010 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:11.188954 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-579f775496-zztsw_7b58fb78-ebee-424f-9022-f06fa1fd9290/registry/0.log" Apr 21 06:41:11.227960 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:11.227937 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qvz4w_b319b9ca-8134-427f-bce9-921c4216c413/node-ca/0.log" Apr 21 06:41:11.765080 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:11.765038 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" event={"ID":"20e2b4b5-f029-47af-873c-b8c2465a40f9","Type":"ContainerStarted","Data":"b9058a565e3ef4f896bddacdce5b99902dc2edb718e90a77af3a3606236e5226"} Apr 21 06:41:11.765080 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:11.765083 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" event={"ID":"20e2b4b5-f029-47af-873c-b8c2465a40f9","Type":"ContainerStarted","Data":"97ae2b41ba0b5708b0de38952d09283685788b0cd6a06dde97a662a6d1b785a8"} Apr 21 06:41:11.765534 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:11.765181 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:11.780375 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:11.780327 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" podStartSLOduration=1.780313617 podStartE2EDuration="1.780313617s" podCreationTimestamp="2026-04-21 06:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 06:41:11.778764408 +0000 UTC m=+886.263415431" watchObservedRunningTime="2026-04-21 06:41:11.780313617 +0000 UTC m=+886.264964639" Apr 21 06:41:12.209919 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:12.209879 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mdnlh_4aa4526d-b9ab-4ef8-9333-2a849fe6acc7/serve-healthcheck-canary/0.log" Apr 21 06:41:12.605839 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:12.605811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5n64c_b31dbc8f-189e-47fe-8be6-d02332dd3cb9/kube-rbac-proxy/0.log" Apr 21 06:41:12.623059 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:12.623031 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5n64c_b31dbc8f-189e-47fe-8be6-d02332dd3cb9/exporter/0.log" Apr 21 06:41:12.640873 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:12.640855 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5n64c_b31dbc8f-189e-47fe-8be6-d02332dd3cb9/extractor/0.log" Apr 21 06:41:14.226853 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:14.226816 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-5cc8ccb8b7-jbp4b_d78c09bc-d1a1-4d6e-93a9-46643fb3fd5c/manager/0.log" Apr 21 06:41:14.249689 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:14.249666 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-tpxsq_b271c541-e1a2-4f7d-876a-2cc8c9630482/jobset-operator/0.log" Apr 21 06:41:17.118852 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:17.118814 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dfjbd_250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2/migrator/0.log" Apr 21 06:41:17.135718 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:17.135695 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dfjbd_250dc6fb-a5a2-4b36-8c2d-2c0ba8d08ee2/graceful-termination/0.log" Apr 21 06:41:17.777822 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:17.777795 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lwwlk/perf-node-gather-daemonset-xhl44" Apr 21 06:41:18.430692 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.430666 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/kube-multus-additional-cni-plugins/0.log" Apr 21 06:41:18.449435 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.449411 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/egress-router-binary-copy/0.log" Apr 21 06:41:18.467620 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.467599 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/cni-plugins/0.log" Apr 21 06:41:18.486278 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.486261 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/bond-cni-plugin/0.log" Apr 21 06:41:18.503776 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.503754 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/routeoverride-cni/0.log" Apr 21 06:41:18.520754 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.520735 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/whereabouts-cni-bincopy/0.log" Apr 21 06:41:18.538361 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:41:18.538341 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-stw9q_2b985311-2ecb-45b4-8665-a9a42cef2837/whereabouts-cni/0.log" Apr 21 06:41:18.625034 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.625002 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vpw6q_a81bc131-222a-47ad-9171-0a4db0b65c51/kube-multus/0.log" Apr 21 06:41:18.689631 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.689584 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qjchj_546de538-b56a-4ad2-baeb-3d59144586fb/network-metrics-daemon/0.log" Apr 21 06:41:18.707382 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:18.707363 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qjchj_546de538-b56a-4ad2-baeb-3d59144586fb/kube-rbac-proxy/0.log" Apr 21 06:41:20.173409 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.173381 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-controller/0.log" Apr 21 06:41:20.189706 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.189682 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log" Apr 21 06:41:20.197884 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.197862 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/1.log" Apr 21 06:41:20.218143 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.218119 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/kube-rbac-proxy-node/0.log" Apr 21 06:41:20.237443 ip-10-0-138-76 
kubenswrapper[2571]: I0421 06:41:20.237416 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 06:41:20.255284 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.255230 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/northd/0.log" Apr 21 06:41:20.273952 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.273935 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/nbdb/0.log" Apr 21 06:41:20.292856 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.292830 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/sbdb/0.log" Apr 21 06:41:20.449009 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:20.448979 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovnkube-controller/0.log" Apr 21 06:41:21.342923 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:21.342878 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-gcmqq_afbf0267-655b-4cf1-bb8f-dcfa09f69f56/check-endpoints/0.log" Apr 21 06:41:21.409994 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:21.409965 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zk874_fdd3109b-7468-400c-b587-0e2d50c0911b/network-check-target-container/0.log" Apr 21 06:41:22.237573 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:22.237542 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-59vzs_8eba033d-48d3-4a60-b429-c79feb5274f3/iptables-alerter/0.log" Apr 
21 06:41:22.879000 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:22.878962 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-p6x5s_6e96c3e6-5bac-49c9-b707-018f191114fa/tuned/0.log" Apr 21 06:41:25.285693 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:25.285658 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-s74rh_19694577-e921-48dd-acfc-d48492e5ee03/service-ca-operator/1.log" Apr 21 06:41:25.287099 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:25.287077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-s74rh_19694577-e921-48dd-acfc-d48492e5ee03/service-ca-operator/0.log" Apr 21 06:41:25.913823 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:25.913739 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-mr8fb_5b1c2098-f1cb-4a0f-a8c1-d131e97e930d/csi-driver/0.log" Apr 21 06:41:25.931415 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:25.931386 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-mr8fb_5b1c2098-f1cb-4a0f-a8c1-d131e97e930d/csi-node-driver-registrar/0.log" Apr 21 06:41:25.949037 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:25.949014 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-mr8fb_5b1c2098-f1cb-4a0f-a8c1-d131e97e930d/csi-liveness-probe/0.log" Apr 21 06:41:26.043125 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:26.043097 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log" Apr 21 06:41:26.045500 ip-10-0-138-76 kubenswrapper[2571]: I0421 06:41:26.045476 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8rxc_9b8417f4-abc8-485b-8bfc-78987d632957/ovn-acl-logging/0.log"