Apr 21 15:32:45.732838 ip-10-0-133-237 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:32:45.732851 ip-10-0-133-237 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:32:45.732860 ip-10-0-133-237 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:32:45.733222 ip-10-0-133-237 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:32:55.790009 ip-10-0-133-237 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:32:55.790026 ip-10-0-133-237 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b3195a1fc13d4e87b4ab407e47bc1522 --
Apr 21 15:35:18.577385 ip-10-0-133-237 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:35:19.048356 ip-10-0-133-237 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:19.048356 ip-10-0-133-237 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:35:19.048356 ip-10-0-133-237 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:19.048356 ip-10-0-133-237 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
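Context for the first boot in this excerpt: systemd could not read an environment file referenced by kubelet.service, the start-pre task then failed, and the unit ended with result 'resources'; the follow-up restart could not even be scheduled because crio.service did not exist yet. As a hedged illustration only (the real OpenShift-managed unit and file path differ; `/etc/kubernetes/kubelet-env` here is an assumed, illustrative path), systemd's own mechanism for tolerating a not-yet-written environment file is the `-` prefix on `EnvironmentFile=`:

```ini
# Illustrative drop-in, e.g. /etc/systemd/system/kubelet.service.d/10-env.conf
[Service]
# The leading "-" tells systemd to skip a missing environment file instead
# of failing the unit ("Failed to load environment files" / result 'resources').
EnvironmentFile=-/etc/kubernetes/kubelet-env
```

On the second boot shown below, both the environment file and crio.service are present, so the unit starts normally.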
Apr 21 15:35:19.048356 ip-10-0-133-237 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:19.050242 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.050152 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:35:19.053269 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053254 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:19.053269 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053269 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053273 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053276 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053279 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053284 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053287 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053290 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053292 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053296 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053299 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053302 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053305 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053308 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053311 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053314 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053317 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053320 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053323 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053326 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053328 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:19.053329 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053332 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053335 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053338 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053341 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053344 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053346 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053349 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053351 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053354 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053356 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053359 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053362 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053371 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053375 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053379 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053381 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053384 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053386 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053388 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:19.053785 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053391 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053393 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053396 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053398 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053401 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053403 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053405 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053408 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053410 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053413 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053415 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053418 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053421 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053423 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053426 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053429 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053431 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053434 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053437 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053439 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:19.054291 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053442 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053444 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053446 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053449 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053451 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053454 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053459 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053464 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053466 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053469 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053471 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053474 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053476 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053480 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053484 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053487 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053490 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053497 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053500 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:19.054762 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053503 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053505 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053508 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053511 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053513 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053516 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053519 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053924 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053929 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053932 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053935 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053937 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053940 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053943 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053945 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053948 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053951 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053953 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053955 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053963 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:19.055296 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053966 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053969 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053971 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053973 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053976 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053978 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053981 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053983 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053986 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053988 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053991 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053993 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053995 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.053998 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054000 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054003 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054005 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054008 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054010 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054013 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:19.055779 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054015 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054018 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054021 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054023 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054025 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054028 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054030 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054033 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054035 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054047 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054050 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054052 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054060 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054063 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054065 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054068 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054070 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054073 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054075 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054078 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:19.056275 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054080 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054082 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054085 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054090 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
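The warning block above repeats the same per-gate message for every gate, and the whole set is emitted more than once, which makes the journal hard to scan. A minimal sketch for collapsing it into one count per gate (assuming you have exported the unit's journal to a file; `kubelet.log` is an illustrative placeholder, e.g. from `journalctl -u kubelet --no-pager > kubelet.log`):

```shell
# Collapse repeated "unrecognized feature gate" warnings into per-gate counts,
# most frequent first. "kubelet.log" is a placeholder for a journal export.
grep -oE 'unrecognized feature gate: [A-Za-z0-9]+' kubelet.log \
  | sort | uniq -c | sort -rn
```

A gate that appears exactly twice per name is consistent with the gate set being parsed twice during startup rather than with a flapping configuration.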
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054093 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054096 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054098 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054102 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054105 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054109 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054112 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054115 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054118 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054121 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054123 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054126 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054129 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054144 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054147 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:19.056757 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054150 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054153 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054155 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054158 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054160 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054163 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054171 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054174 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054177 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054179 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054182 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054184 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054187 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054189 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054265 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054274 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054291 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054296 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054300 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054304 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054308 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:35:19.057255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054312 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054316 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054319 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054322 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054326 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054329 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054332 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054335 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054338 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054340 2570 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054343 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054346 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054353 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054356 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054359 2570 flags.go:64] FLAG: --config-dir=""
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054362 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054366 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054369 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054378 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054381 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054384 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054387 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054390 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054393 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054396 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:35:19.057745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054399 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054403 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054406 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054409 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054412 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054415 2570 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054418 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054425 2570 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054428 2570 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054431 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054434 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054437 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054441 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054444 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054447 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054450 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054453 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054456 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054459 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054461 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054464 2570 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054467 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054470 2570 flags.go:64] FLAG: --feature-gates="" Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054473 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054476 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 15:35:19.058350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054480 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054483 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054487 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054490 2570 flags.go:64] FLAG: --help="false" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054493 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-133-237.ec2.internal" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054496 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054499 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054502 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054505 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 
15:35:19.054508 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054511 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054514 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054516 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054519 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054522 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054525 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054528 2570 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054531 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054534 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054537 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054540 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054543 2570 flags.go:64] FLAG: --lock-file="" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054545 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:35:19.058939 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054548 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:35:19.058939 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054552 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054557 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054560 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054563 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054565 2570 flags.go:64] FLAG: --logging-format="text" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054568 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054571 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054574 2570 flags.go:64] FLAG: --manifest-url="" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054577 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054581 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054594 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054599 2570 flags.go:64] FLAG: --max-pods="110" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054602 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054605 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054608 2570 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054611 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054614 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054616 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054619 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054626 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054629 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054632 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054635 2570 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:35:19.059506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054638 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054643 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054646 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054649 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054652 2570 flags.go:64] FLAG: --port="10250" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 
15:35:19.054655 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054658 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03fe5b0bda0928063" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054661 2570 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054663 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054666 2570 flags.go:64] FLAG: --register-node="true" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054670 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054673 2570 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054676 2570 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054679 2570 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054682 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054685 2570 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054688 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054691 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054694 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054697 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:35:19.060045 ip-10-0-133-237 
kubenswrapper[2570]: I0421 15:35:19.054700 2570 flags.go:64] FLAG: --runonce="false" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054704 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054707 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054709 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054713 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054715 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:35:19.060045 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054718 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054721 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054724 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054727 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054730 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054733 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054736 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054739 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054742 2570 
flags.go:64] FLAG: --system-cgroups="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054745 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054750 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054753 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054756 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054760 2570 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054763 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054766 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054769 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054772 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054775 2570 flags.go:64] FLAG: --v="2" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054779 2570 flags.go:64] FLAG: --version="false" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054784 2570 flags.go:64] FLAG: --vmodule="" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054788 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.054791 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:35:19.060657 ip-10-0-133-237 
kubenswrapper[2570]: W0421 15:35:19.054900 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:19.060657 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054903 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054906 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054909 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054913 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054916 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054918 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054921 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054924 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054926 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054929 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054932 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054934 2570 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054936 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054939 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054944 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054946 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054949 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054951 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054954 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:19.061256 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054956 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054959 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054962 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054964 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054967 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 
15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054969 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054972 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054974 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054977 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054980 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054982 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054985 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054987 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054989 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054992 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054994 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.054997 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055000 2570 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055002 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055005 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:19.061753 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055007 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055011 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055014 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055016 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055019 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055022 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055024 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055028 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055030 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055033 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:19.062250 
ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055035 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055037 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055040 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055042 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055045 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055047 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055050 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055052 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055055 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055057 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:19.062250 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055060 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055062 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055064 2570 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055067 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055070 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055072 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055075 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055077 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055080 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055082 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055086 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055090 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055094 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055099 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055102 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055105 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055107 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055110 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055113 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055117 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:19.063020 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055119 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:19.063696 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055122 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:19.063696 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055124 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:19.063696 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055127 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:19.063696 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055129 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:19.063696 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.055143 2570 feature_gate.go:328] unrecognized feature gate: 
PinnedImages Apr 21 15:35:19.063696 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.055966 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:19.065952 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.065934 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 15:35:19.065991 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.065953 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066001 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066006 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066010 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066013 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066015 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066018 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:19.066018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066020 2570 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066023 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066026 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066029 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066031 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066034 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066037 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066039 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066041 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066044 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066046 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066049 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066051 2570 feature_gate.go:328] unrecognized feature 
gate: AdminNetworkPolicy Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066054 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066057 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066059 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066062 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066065 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066067 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:19.066209 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066070 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066072 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066075 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066078 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066082 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066086 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066089 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066092 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066095 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066097 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066100 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066102 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066105 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066108 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066110 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066114 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066116 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066119 2570 feature_gate.go:328] unrecognized feature gate: 
Example2 Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066121 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066124 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:19.066668 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066126 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066129 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066145 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066149 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066151 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066154 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066157 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066159 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066161 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066164 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066166 2570 feature_gate.go:328] unrecognized feature 
gate: SigstoreImageVerificationPKI Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066169 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066171 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066174 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066176 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066178 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066182 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066186 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066189 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066192 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:19.067157 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066194 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066197 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066199 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066202 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066204 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066206 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066209 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066212 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066215 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066217 2570 feature_gate.go:328] unrecognized 
feature gate: InsightsConfigAPI Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066220 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066224 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066226 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066229 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066231 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066234 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066236 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066238 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066241 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066244 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:19.067630 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066246 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.066251 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066340 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066345 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066348 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066351 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066354 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066356 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066359 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066361 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066364 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066366 2570 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066369 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066371 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066374 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:19.068098 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066376 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066379 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066382 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066384 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066387 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066389 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066392 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066395 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066397 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066400 2570 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066403 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066405 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066408 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066410 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066413 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066415 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066418 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066421 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066424 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066427 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:19.068541 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066429 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066432 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066434 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066437 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066439 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066441 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066444 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066446 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066449 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066451 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066453 2570 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066456 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066459 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066461 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066463 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066466 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066468 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066471 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066473 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066476 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:19.068997 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066479 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066481 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066484 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 
15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066486 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066489 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066491 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066494 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066496 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066499 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066501 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066503 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066506 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066508 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066510 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066513 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066515 2570 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066517 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066520 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066522 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066525 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:19.069478 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066527 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066530 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066532 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066535 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066537 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066539 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066542 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066545 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:19.069944 ip-10-0-133-237 
kubenswrapper[2570]: W0421 15:35:19.066548 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066550 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066554 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066557 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:19.066560 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.066565 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:19.069944 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.067260 2570 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 15:35:19.074430 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.074416 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 15:35:19.075394 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.075382 2570 server.go:1019] "Starting client certificate rotation" Apr 21 15:35:19.075491 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.075478 2570 
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:19.075525 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.075517 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 15:35:19.102212 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.102194 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:19.107128 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.107106 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 15:35:19.120861 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.120843 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 21 15:35:19.127683 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.127666 2570 log.go:25] "Validated CRI v1 image API"
Apr 21 15:35:19.130673 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.130656 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 15:35:19.135014 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.134994 2570 fs.go:135] Filesystem UUIDs: map[5cbf9ca6-bd0f-4c97-a246-17e3c9f1ae8e:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 95a5204c-2b60-41e0-8869-41c1a5208895:/dev/nvme0n1p4]
Apr 21 15:35:19.135077 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.135013 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 15:35:19.142077 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.141974 2570 manager.go:217] Machine: {Timestamp:2026-04-21 15:35:19.140089782 +0000 UTC m=+0.440943975 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200260 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23e1973ad5ae8dbb3397de7912cd03 SystemUUID:ec23e197-3ad5-ae8d-bb33-97de7912cd03 BootID:b3195a1f-c13d-4e87-b4ab-407e47bc1522 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d6:9d:06:31:43 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d6:9d:06:31:43 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:16:63:c9:78:ea Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 15:35:19.142077 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.142067 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 15:35:19.142202 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.142187 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 15:35:19.143424 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.143402 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 15:35:19.143561 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.143426 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-237.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 15:35:19.143609 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.143569 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 15:35:19.143609 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.143577 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 15:35:19.143609 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.143593 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:19.144398 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.144387 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 15:35:19.145270 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.145260 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:19.145527 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.145517 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 15:35:19.148630 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.148613 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 15:35:19.148923 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.148911 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 15:35:19.148960 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.148932 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 15:35:19.148960 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.148948 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 15:35:19.148960 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.148958 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 21 15:35:19.149043 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.148967 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 15:35:19.150128 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.150113 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:19.150194 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.150156 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 15:35:19.153457 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.153442 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 15:35:19.154768 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.154754 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 15:35:19.156252 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156241 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156258 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156264 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156271 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156277 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156282 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156288 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156293 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156300 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156307 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 15:35:19.156316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156320 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 15:35:19.156558 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.156329 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 15:35:19.157237 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.157228 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 15:35:19.157237 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.157237 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 15:35:19.160549 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.160537 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 15:35:19.160642 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.160570 2570 server.go:1295] "Started kubelet"
Apr 21 15:35:19.160692 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.160642 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 15:35:19.160739 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.160671 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 15:35:19.160819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.160748 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 15:35:19.161624 ip-10-0-133-237 systemd[1]: Started Kubernetes Kubelet.
Apr 21 15:35:19.161894 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.161740 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 15:35:19.163275 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.163261 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 15:35:19.165937 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.165919 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-237.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 15:35:19.166032 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.165992 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 15:35:19.166205 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.166187 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-237.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 15:35:19.170003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.169976 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:19.170442 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.170423 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 15:35:19.171101 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.171080 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 15:35:19.171215 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171127 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 15:35:19.171215 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171127 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 15:35:19.171215 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171170 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 15:35:19.171347 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171252 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 15:35:19.171347 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171260 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 15:35:19.171347 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.171282 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.171347 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171325 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 15:35:19.171347 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171336 2570 factory.go:55] Registering systemd factory
Apr 21 15:35:19.171347 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171342 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 21 15:35:19.171596 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171515 2570 factory.go:153] Registering CRI-O factory
Apr 21 15:35:19.171596 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171524 2570 factory.go:223] Registration of the crio container factory successfully
Apr 21 15:35:19.171596 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171542 2570 factory.go:103] Registering Raw factory
Apr 21 15:35:19.171596 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171550 2570 manager.go:1196] Started watching for new ooms in manager
Apr 21 15:35:19.171897 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.171883 2570 manager.go:319] Starting recovery of all containers
Apr 21 15:35:19.178913 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.178883 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 15:35:19.180907 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.180869 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 15:35:19.180907 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.180868 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-237.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 15:35:19.182252 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.180964 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-237.ec2.internal.18a86932a08d2aec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-237.ec2.internal,UID:ip-10-0-133-237.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-237.ec2.internal,},FirstTimestamp:2026-04-21 15:35:19.160548076 +0000 UTC m=+0.461402271,LastTimestamp:2026-04-21 15:35:19.160548076 +0000 UTC m=+0.461402271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-237.ec2.internal,}"
Apr 21 15:35:19.182796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.182778 2570 manager.go:324] Recovery completed
Apr 21 15:35:19.187020 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.187009 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:19.189383 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.189367 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:19.189486 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.189394 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:19.189486 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.189404 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:19.189891 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.189876 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 15:35:19.189891 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.189892 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 15:35:19.189972 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.189909 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 15:35:19.192319 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.192306 2570 policy_none.go:49] "None policy: Start"
Apr 21 15:35:19.192371 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.192321 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 15:35:19.192371 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.192331 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 15:35:19.192433 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.192336 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-237.ec2.internal.18a86932a245236c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-237.ec2.internal,UID:ip-10-0-133-237.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-237.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-237.ec2.internal,},FirstTimestamp:2026-04-21 15:35:19.189381996 +0000 UTC m=+0.490236188,LastTimestamp:2026-04-21 15:35:19.189381996 +0000 UTC m=+0.490236188,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-237.ec2.internal,}"
Apr 21 15:35:19.206725 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.206653 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-237.ec2.internal.18a86932a245637e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-237.ec2.internal,UID:ip-10-0-133-237.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-237.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-237.ec2.internal,},FirstTimestamp:2026-04-21 15:35:19.189398398 +0000 UTC m=+0.490252590,LastTimestamp:2026-04-21 15:35:19.189398398 +0000 UTC m=+0.490252590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-237.ec2.internal,}"
Apr 21 15:35:19.217246 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.217228 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r728z"
Apr 21 15:35:19.219890 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.219828 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-237.ec2.internal.18a86932a24586a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-237.ec2.internal,UID:ip-10-0-133-237.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-133-237.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-133-237.ec2.internal,},FirstTimestamp:2026-04-21 15:35:19.1894074 +0000 UTC m=+0.490261596,LastTimestamp:2026-04-21 15:35:19.1894074 +0000 UTC m=+0.490261596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-237.ec2.internal,}"
Apr 21 15:35:19.229511 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.229493 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r728z"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.231675 2570 manager.go:341] "Starting Device Plugin manager"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.231700 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.231709 2570 server.go:85] "Starting device plugin registration server"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.231961 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.231974 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.232053 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.232148 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.232157 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.232731 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 15:35:19.248495 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.232770 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.331797 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.331736 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 15:35:19.331797 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.331766 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 15:35:19.331961 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.331818 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 15:35:19.331961 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.331824 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 15:35:19.331961 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.331857 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 15:35:19.332118 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.332097 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:19.334656 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.334640 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:19.336213 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.336189 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:19.336313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.336233 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:19.336313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.336249 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:19.336313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.336284 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.344967 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.344952 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.345041 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.344971 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-237.ec2.internal\": node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.363793 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.363769 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.432969 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.432946 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal"]
Apr 21 15:35:19.433049 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.433006 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:19.433812 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.433798 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:19.433870 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.433822 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:19.433870 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.433831 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:19.435078 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435066 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:19.435266 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435253 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.435313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435286 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:19.435749 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435734 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:19.435796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435767 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:19.435796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435770 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:19.435796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435790 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:19.435896 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435777 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:19.435896 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.435805 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:19.436921 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.436907 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.436969 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.436942 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 15:35:19.438122 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.438110 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 15:35:19.438201 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.438129 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 15:35:19.438201 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.438155 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeHasSufficientPID"
Apr 21 15:35:19.461660 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.461643 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-237.ec2.internal\" not found" node="ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.463849 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.463833 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.465767 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.465753 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-237.ec2.internal\" not found" node="ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.474049 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.474034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cf83ad13b80947a7f325864637563c9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal\" (UID: \"7cf83ad13b80947a7f325864637563c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.474113 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.474057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cf83ad13b80947a7f325864637563c9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal\" (UID: \"7cf83ad13b80947a7f325864637563c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.474113 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.474074 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9f0db16f43c11bf91bf71ea6d873f37-config\") pod \"kube-apiserver-proxy-ip-10-0-133-237.ec2.internal\" (UID: \"c9f0db16f43c11bf91bf71ea6d873f37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.564528 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.564497 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.574864 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.574843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cf83ad13b80947a7f325864637563c9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal\" (UID: \"7cf83ad13b80947a7f325864637563c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.574943 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.574876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cf83ad13b80947a7f325864637563c9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal\" (UID: \"7cf83ad13b80947a7f325864637563c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.574943 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.574893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9f0db16f43c11bf91bf71ea6d873f37-config\") pod \"kube-apiserver-proxy-ip-10-0-133-237.ec2.internal\" (UID: \"c9f0db16f43c11bf91bf71ea6d873f37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.575019 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.574985 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c9f0db16f43c11bf91bf71ea6d873f37-config\") pod \"kube-apiserver-proxy-ip-10-0-133-237.ec2.internal\" (UID: \"c9f0db16f43c11bf91bf71ea6d873f37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.575061 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.575042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cf83ad13b80947a7f325864637563c9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal\" (UID: \"7cf83ad13b80947a7f325864637563c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.575330 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.575311 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cf83ad13b80947a7f325864637563c9-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal\" (UID: \"7cf83ad13b80947a7f325864637563c9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.664792 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.664724 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.765285 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.765255 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.765343 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.765288 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.768344 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:19.767896 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal"
Apr 21 15:35:19.866080 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.866043 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:19.966616 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:19.966544 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:20.067215 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:20.067184 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found"
Apr 21 15:35:20.075523 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.075509 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:35:20.075659 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.075638 2570 reflector.go:556] "Warning: watch ended with error"
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 15:35:20.167751 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:20.167725 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found" Apr 21 15:35:20.170904 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.170888 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 15:35:20.194444 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.194414 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:35:20.216055 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.216033 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2fr8r" Apr 21 15:35:20.223125 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.223077 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2fr8r" Apr 21 15:35:20.231967 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.231935 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:30:19 +0000 UTC" deadline="2027-09-21 20:48:27.47730743 +0000 UTC" Apr 21 15:35:20.231967 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.231959 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12437h13m7.245350774s" Apr 21 15:35:20.268200 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:20.268176 2570 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-133-237.ec2.internal\" not found" Apr 21 15:35:20.323526 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.323501 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:20.345317 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:20.345267 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf83ad13b80947a7f325864637563c9.slice/crio-720c822f8ce02f3ea27dba018c6fc5084bbafdfb63ad477735484e541c90f7b4 WatchSource:0}: Error finding container 720c822f8ce02f3ea27dba018c6fc5084bbafdfb63ad477735484e541c90f7b4: Status 404 returned error can't find the container with id 720c822f8ce02f3ea27dba018c6fc5084bbafdfb63ad477735484e541c90f7b4 Apr 21 15:35:20.345711 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:20.345689 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f0db16f43c11bf91bf71ea6d873f37.slice/crio-d8380a83e9f78a1295b60877428434bd2412f668cc5b3a825ea50d0894e9f464 WatchSource:0}: Error finding container d8380a83e9f78a1295b60877428434bd2412f668cc5b3a825ea50d0894e9f464: Status 404 returned error can't find the container with id d8380a83e9f78a1295b60877428434bd2412f668cc5b3a825ea50d0894e9f464 Apr 21 15:35:20.349732 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.349719 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:35:20.368772 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:20.368754 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found" Apr 21 15:35:20.469328 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:20.469299 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-237.ec2.internal\" not found" Apr 21 15:35:20.498695 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.498634 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:20.521142 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.521117 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:20.570602 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.570575 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal" Apr 21 15:35:20.582593 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.582571 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:20.583685 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.583673 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal" Apr 21 15:35:20.592801 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:20.592787 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:21.150714 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.150682 2570 apiserver.go:52] "Watching apiserver" Apr 21 15:35:21.157584 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.157561 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 15:35:21.158937 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.158901 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/network-metrics-daemon-x2rv7","openshift-network-operator/iptables-alerter-tvmfk","openshift-ovn-kubernetes/ovnkube-node-8ndqn","kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal","openshift-cluster-node-tuning-operator/tuned-r9ngc","openshift-image-registry/node-ca-qsgrz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal","openshift-multus/multus-n4dx6","openshift-network-diagnostics/network-check-target-4spst","kube-system/konnectivity-agent-9222j","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm","openshift-multus/multus-additional-cni-plugins-kltss"] Apr 21 15:35:21.161909 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.161880 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:21.162032 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.161970 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:21.162032 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.162011 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tvmfk" Apr 21 15:35:21.163125 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.163103 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.164149 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.164092 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:21.164149 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.164119 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:21.164304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.164178 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mdsh5\"" Apr 21 15:35:21.164304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.164223 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 15:35:21.165126 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165093 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 15:35:21.165305 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165106 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 15:35:21.165305 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165181 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 15:35:21.165305 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165237 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-p5zgv\"" Apr 21 15:35:21.165528 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165455 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 15:35:21.165594 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165532 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.165594 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.165556 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 15:35:21.166881 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.166837 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 15:35:21.168758 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.167230 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qsgrz" Apr 21 15:35:21.168758 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.167405 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.169103 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.169080 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-csxqg\"" Apr 21 15:35:21.169326 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.169308 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.169662 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.169949 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170006 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170298 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170358 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170384 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 15:35:21.170518 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170302 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 15:35:21.171218 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170553 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9brrf\"" Apr 21 15:35:21.171218 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170640 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 15:35:21.171218 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.170643 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pp4l5\"" Apr 21 15:35:21.171345 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.171220 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:21.171345 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.171281 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:21.172825 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.172804 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9222j" Apr 21 15:35:21.174068 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.174036 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.174707 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.174626 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 15:35:21.174707 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.174688 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jtnbc\"" Apr 21 15:35:21.174861 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.174691 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 15:35:21.175399 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.175380 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.176033 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.176015 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 15:35:21.176126 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.176071 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 15:35:21.176126 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.176082 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 15:35:21.176126 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.176073 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mlqqg\"" Apr 21 15:35:21.177051 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.177032 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-62wlx\"" Apr 21 15:35:21.177220 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.177200 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 15:35:21.177659 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.177644 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 15:35:21.184753 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vnk\" (UniqueName: \"kubernetes.io/projected/08d2130c-7332-485b-95f8-0728da25787a-kube-api-access-c4vnk\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz" Apr 21 15:35:21.184847 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184761 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-cni-bin\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.184847 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184781 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-kubelet\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.184847 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184805 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.184847 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184830 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovnkube-config\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185030 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184887 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96ab5087-ce27-46c7-81df-f0c0d2767c77-host-slash\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk" Apr 21 15:35:21.185030 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184922 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-conf-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185030 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184950 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsdn\" (UniqueName: \"kubernetes.io/projected/96ab5087-ce27-46c7-81df-f0c0d2767c77-kube-api-access-sdsdn\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk" Apr 21 15:35:21.185030 ip-10-0-133-237 
kubenswrapper[2570]: I0421 15:35:21.184974 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovn-node-metrics-cert\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185030 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.184997 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-node-log\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185030 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185020 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-cni-bin\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185069 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-os-release\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185113 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-etc-kubernetes\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " 
pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185158 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50b4859e-da58-4584-a53e-a4daaccafc4c-konnectivity-ca\") pod \"konnectivity-agent-9222j\" (UID: \"50b4859e-da58-4584-a53e-a4daaccafc4c\") " pod="kube-system/konnectivity-agent-9222j" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-kubelet\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-cni-netd\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmqp7\" (UniqueName: \"kubernetes.io/projected/81a2659f-602c-4c12-bd0d-20488c10a56f-kube-api-access-wmqp7\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185253 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185251 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08d2130c-7332-485b-95f8-0728da25787a-host\") pod 
\"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz" Apr 21 15:35:21.185480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81a2659f-602c-4c12-bd0d-20488c10a56f-cni-binary-copy\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185320 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-hostroot\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185365 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-tuned\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.185480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185398 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-ovn\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185425 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185462 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-run\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185485 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-host\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-env-overrides\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185552 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-sys\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185573 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/96ab5087-ce27-46c7-81df-f0c0d2767c77-iptables-alerter-script\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185604 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-lib-modules\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185626 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-system-cni-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-daemon-config\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 
21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185663 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-var-lib-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.185703 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-netns\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185709 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-cni-multus\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5kqx\" (UniqueName: \"kubernetes.io/projected/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-kube-api-access-g5kqx\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-modprobe-d\") pod 
\"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185772 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-systemd\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-kubernetes\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-systemd\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-cni-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185845 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh6fw\" (UniqueName: 
\"kubernetes.io/projected/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-kube-api-access-rh6fw\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185881 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-cnibin\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-socket-dir-parent\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysctl-conf\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.185983 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-tmp\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186004 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50b4859e-da58-4584-a53e-a4daaccafc4c-agent-certs\") pod \"konnectivity-agent-9222j\" (UID: \"50b4859e-da58-4584-a53e-a4daaccafc4c\") " pod="kube-system/konnectivity-agent-9222j" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186036 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovnkube-script-lib\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186068 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-var-lib-kubelet\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186087 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-k8s-cni-cncf-io\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186866 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-multus-certs\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmh8h\" (UniqueName: \"kubernetes.io/projected/54264bba-76e1-44c8-8581-4f2271e68bd7-kube-api-access-zmh8h\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186177 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/08d2130c-7332-485b-95f8-0728da25787a-serviceca\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186190 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186207 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-systemd-units\") pod \"ovnkube-node-8ndqn\" (UID: 
\"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186240 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysctl-d\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-slash\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-run-netns\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186318 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-etc-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-log-socket\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.186866 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.186359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysconfig\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.224690 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.224660 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:20 +0000 UTC" deadline="2027-12-16 10:51:13.981110747 +0000 UTC" Apr 21 15:35:21.224786 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.224693 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14491h15m52.756421644s" Apr 21 15:35:21.272643 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.272613 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 15:35:21.286907 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.286878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/96ab5087-ce27-46c7-81df-f0c0d2767c77-iptables-alerter-script\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk" Apr 21 15:35:21.287027 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.286914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-lib-modules\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287027 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.286935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-system-cni-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287027 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.286958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-daemon-config\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287027 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.286988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-sys-fs\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.287027 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-var-lib-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287036 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-netns\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-cni-multus\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287099 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5kqx\" (UniqueName: \"kubernetes.io/projected/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-kube-api-access-g5kqx\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287119 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-modprobe-d\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-systemd\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-kubernetes\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287224 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-systemd\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-cni-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287272 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287262 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh6fw\" (UniqueName: \"kubernetes.io/projected/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-kube-api-access-rh6fw\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287285 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-cnibin\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287310 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-socket-dir-parent\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287334 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-device-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-os-release\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysctl-conf\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: 
I0421 15:35:21.287416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-tmp\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50b4859e-da58-4584-a53e-a4daaccafc4c-agent-certs\") pod \"konnectivity-agent-9222j\" (UID: \"50b4859e-da58-4584-a53e-a4daaccafc4c\") " pod="kube-system/konnectivity-agent-9222j" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287462 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovnkube-script-lib\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-var-lib-kubelet\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.287621 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-k8s-cni-cncf-io\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-multus-certs\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmh8h\" (UniqueName: \"kubernetes.io/projected/54264bba-76e1-44c8-8581-4f2271e68bd7-kube-api-access-zmh8h\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/08d2130c-7332-485b-95f8-0728da25787a-serviceca\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz" Apr 21 15:35:21.287621 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 
21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-systemd-units\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysctl-d\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-slash\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-daemon-config\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-run-netns\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 
15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287948 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-var-lib-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-cni-multus\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-modprobe-d\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-multus-certs\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287879 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-slash\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.287894 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-systemd-units\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287943 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-var-lib-kubelet\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-etc-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.288089 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:21.788030364 +0000 UTC m=+3.088884556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288094 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysctl-conf\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.288314 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-log-socket\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysconfig\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-systemd\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysctl-d\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vnk\" (UniqueName: \"kubernetes.io/projected/08d2130c-7332-485b-95f8-0728da25787a-kube-api-access-c4vnk\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288195 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-systemd\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-cni-bin\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287826 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-system-cni-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-cnibin\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-cni-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288222 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-run-netns\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-socket-dir-parent\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-netns\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/08d2130c-7332-485b-95f8-0728da25787a-serviceca\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-cni-bin\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288404 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-etc-openvswitch\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.287793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-lib-modules\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.289076 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288414 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-kubernetes\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-kubelet\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288496 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-run-k8s-cni-cncf-io\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovnkube-config\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288585 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-socket-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-registration-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-cnibin\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288643 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288663 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96ab5087-ce27-46c7-81df-f0c0d2767c77-host-slash\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288735 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-conf-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-host-var-lib-kubelet\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsdn\" (UniqueName: \"kubernetes.io/projected/96ab5087-ce27-46c7-81df-f0c0d2767c77-kube-api-access-sdsdn\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-log-socket\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288823 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovnkube-script-lib\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.289880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/96ab5087-ce27-46c7-81df-f0c0d2767c77-iptables-alerter-script\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288874 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/96ab5087-ce27-46c7-81df-f0c0d2767c77-host-slash\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovn-node-metrics-cert\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288927 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-sysconfig\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.288966 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-multus-conf-dir\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9dn\" (UniqueName: \"kubernetes.io/projected/abc50d87-ddda-484f-bcda-07b2af6fbf70-kube-api-access-dr9dn\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289061 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovnkube-config\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289069 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-node-log\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-cni-bin\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-node-log\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289155 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-os-release\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-cni-bin\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289216 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-etc-kubernetes\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50b4859e-da58-4584-a53e-a4daaccafc4c-konnectivity-ca\") pod \"konnectivity-agent-9222j\" (UID: \"50b4859e-da58-4584-a53e-a4daaccafc4c\") " pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289301 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-os-release\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.290663 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-kubelet\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-etc-kubernetes\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-cni-netd\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmqp7\" (UniqueName: \"kubernetes.io/projected/81a2659f-602c-4c12-bd0d-20488c10a56f-kube-api-access-wmqp7\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqgl7\" (UniqueName: \"kubernetes.io/projected/1962acd0-a20c-4e31-9994-8d210722d639-kube-api-access-fqgl7\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08d2130c-7332-485b-95f8-0728da25787a-host\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81a2659f-602c-4c12-bd0d-20488c10a56f-cni-binary-copy\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289545 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-hostroot\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-etc-selinux\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289620 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-tuned\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-ovn\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-run\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289742 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-kubelet\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289738 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/08d2130c-7332-485b-95f8-0728da25787a-host\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289782 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-host\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.291415 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-run\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/81a2659f-602c-4c12-bd0d-20488c10a56f-hostroot\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50b4859e-da58-4584-a53e-a4daaccafc4c-konnectivity-ca\") pod \"konnectivity-agent-9222j\" (UID: \"50b4859e-da58-4584-a53e-a4daaccafc4c\") " pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-system-cni-dir\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-env-overrides\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-sys\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.289976 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-cni-binary-copy\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.290039 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-host\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.290056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-sys\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.290079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-run-ovn\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.290114 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-host-cni-netd\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.290235 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81a2659f-602c-4c12-bd0d-20488c10a56f-cni-binary-copy\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6"
Apr 21 15:35:21.292090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.291166 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-env-overrides\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.292625 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.292348 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-etc-tuned\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.292625 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.292384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-tmp\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.292625 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.292510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50b4859e-da58-4584-a53e-a4daaccafc4c-agent-certs\") pod \"konnectivity-agent-9222j\" (UID: \"50b4859e-da58-4584-a53e-a4daaccafc4c\") " pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:21.292625 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.292605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-ovn-node-metrics-cert\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:21.310676 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.310649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsdn\" (UniqueName: \"kubernetes.io/projected/96ab5087-ce27-46c7-81df-f0c0d2767c77-kube-api-access-sdsdn\") pod \"iptables-alerter-tvmfk\" (UID: \"96ab5087-ce27-46c7-81df-f0c0d2767c77\") " pod="openshift-network-operator/iptables-alerter-tvmfk"
Apr 21 15:35:21.311280 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.311097 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh6fw\" (UniqueName: \"kubernetes.io/projected/9fe3c049-1ac3-41bd-9da4-6cbd245a22bc-kube-api-access-rh6fw\") pod \"tuned-r9ngc\" (UID: \"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc\") " pod="openshift-cluster-node-tuning-operator/tuned-r9ngc"
Apr 21 15:35:21.311280 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.311186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmh8h\" (UniqueName: \"kubernetes.io/projected/54264bba-76e1-44c8-8581-4f2271e68bd7-kube-api-access-zmh8h\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:21.311280 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.311239 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vnk\" (UniqueName: \"kubernetes.io/projected/08d2130c-7332-485b-95f8-0728da25787a-kube-api-access-c4vnk\") pod \"node-ca-qsgrz\" (UID: \"08d2130c-7332-485b-95f8-0728da25787a\") " pod="openshift-image-registry/node-ca-qsgrz"
Apr 21 15:35:21.311629 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.311608 2570
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5kqx\" (UniqueName: \"kubernetes.io/projected/7aae9a17-ae2e-4328-91d9-7fb4b43f79e2-kube-api-access-g5kqx\") pod \"ovnkube-node-8ndqn\" (UID: \"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.311716 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.311612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmqp7\" (UniqueName: \"kubernetes.io/projected/81a2659f-602c-4c12-bd0d-20488c10a56f-kube-api-access-wmqp7\") pod \"multus-n4dx6\" (UID: \"81a2659f-602c-4c12-bd0d-20488c10a56f\") " pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.312750 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.312734 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:21.312821 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.312756 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:21.312821 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.312770 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:21.312905 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.312825 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:21.812810821 +0000 UTC m=+3.113665004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:21.335906 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.335867 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal" event={"ID":"c9f0db16f43c11bf91bf71ea6d873f37","Type":"ContainerStarted","Data":"d8380a83e9f78a1295b60877428434bd2412f668cc5b3a825ea50d0894e9f464"} Apr 21 15:35:21.336808 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.336786 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal" event={"ID":"7cf83ad13b80947a7f325864637563c9","Type":"ContainerStarted","Data":"720c822f8ce02f3ea27dba018c6fc5084bbafdfb63ad477735484e541c90f7b4"} Apr 21 15:35:21.390842 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.390807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-cnibin\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391020 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.390854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kltss\" (UID: 
\"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391020 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.390932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-cnibin\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391020 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.390990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391191 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391020 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9dn\" (UniqueName: \"kubernetes.io/projected/abc50d87-ddda-484f-bcda-07b2af6fbf70-kube-api-access-dr9dn\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391191 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391191 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fqgl7\" (UniqueName: \"kubernetes.io/projected/1962acd0-a20c-4e31-9994-8d210722d639-kube-api-access-fqgl7\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391191 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391116 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-etc-selinux\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391191 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-system-cni-dir\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391408 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391233 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391408 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391233 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-etc-selinux\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 
15:35:21.391408 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-system-cni-dir\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391408 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391367 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-cni-binary-copy\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391567 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-sys-fs\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391567 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391504 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-device-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391567 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391500 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391567 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-os-release\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391567 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-sys-fs\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391567 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391561 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-device-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-socket-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391614 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abc50d87-ddda-484f-bcda-07b2af6fbf70-os-release\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-registration-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.391845 ip-10-0-133-237 
kubenswrapper[2570]: I0421 15:35:21.391697 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-registration-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.391845 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1962acd0-a20c-4e31-9994-8d210722d639-socket-dir\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.392267 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.391849 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abc50d87-ddda-484f-bcda-07b2af6fbf70-cni-binary-copy\") pod \"multus-additional-cni-plugins-kltss\" (UID: \"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.409262 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.409183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqgl7\" (UniqueName: \"kubernetes.io/projected/1962acd0-a20c-4e31-9994-8d210722d639-kube-api-access-fqgl7\") pod \"aws-ebs-csi-driver-node-6g8bm\" (UID: \"1962acd0-a20c-4e31-9994-8d210722d639\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.409473 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.409452 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9dn\" (UniqueName: \"kubernetes.io/projected/abc50d87-ddda-484f-bcda-07b2af6fbf70-kube-api-access-dr9dn\") pod \"multus-additional-cni-plugins-kltss\" (UID: 
\"abc50d87-ddda-484f-bcda-07b2af6fbf70\") " pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.478530 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.478498 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tvmfk" Apr 21 15:35:21.483985 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.483956 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qsgrz" Apr 21 15:35:21.491570 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.491544 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n4dx6" Apr 21 15:35:21.497255 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.497234 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" Apr 21 15:35:21.503894 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.503874 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:35:21.509800 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.509776 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9222j" Apr 21 15:35:21.515362 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.515342 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" Apr 21 15:35:21.519976 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.519952 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kltss" Apr 21 15:35:21.520145 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.520114 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:21.793353 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.793321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:21.793521 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.793448 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:21.793521 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.793520 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:22.793502202 +0000 UTC m=+4.094356397 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:21.893801 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:21.893773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:21.893951 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.893883 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:21.893951 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.893896 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:21.893951 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.893903 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:21.893951 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:21.893949 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:22.893935443 +0000 UTC m=+4.194789623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:21.922894 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.922864 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fe3c049_1ac3_41bd_9da4_6cbd245a22bc.slice/crio-ef75abf33ae7b24b3ed3d701901e01a6845f99e5e8d028b0e0c0378147d9b631 WatchSource:0}: Error finding container ef75abf33ae7b24b3ed3d701901e01a6845f99e5e8d028b0e0c0378147d9b631: Status 404 returned error can't find the container with id ef75abf33ae7b24b3ed3d701901e01a6845f99e5e8d028b0e0c0378147d9b631 Apr 21 15:35:21.923368 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.923252 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabc50d87_ddda_484f_bcda_07b2af6fbf70.slice/crio-e4b3a639229c208d37957cebbfaeb9fbbed487bde2fc4adc8779770e4d55f18d WatchSource:0}: Error finding container e4b3a639229c208d37957cebbfaeb9fbbed487bde2fc4adc8779770e4d55f18d: Status 404 returned error can't find the container with id e4b3a639229c208d37957cebbfaeb9fbbed487bde2fc4adc8779770e4d55f18d Apr 21 15:35:21.927744 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.927609 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a2659f_602c_4c12_bd0d_20488c10a56f.slice/crio-a59ee48d87edb7e13cadaaf7f7ff749189d0ef43aa3310b940e56cfcf491aebb WatchSource:0}: Error finding container 
a59ee48d87edb7e13cadaaf7f7ff749189d0ef43aa3310b940e56cfcf491aebb: Status 404 returned error can't find the container with id a59ee48d87edb7e13cadaaf7f7ff749189d0ef43aa3310b940e56cfcf491aebb Apr 21 15:35:21.930018 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.930002 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1962acd0_a20c_4e31_9994_8d210722d639.slice/crio-b311731498e330a09e0e0dd0e6cbe047663cc2b1bca7f5e39c822e2fae0d2b07 WatchSource:0}: Error finding container b311731498e330a09e0e0dd0e6cbe047663cc2b1bca7f5e39c822e2fae0d2b07: Status 404 returned error can't find the container with id b311731498e330a09e0e0dd0e6cbe047663cc2b1bca7f5e39c822e2fae0d2b07 Apr 21 15:35:21.930915 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.930893 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96ab5087_ce27_46c7_81df_f0c0d2767c77.slice/crio-4b64aabc22330e5279f8982aa5e3f976426466127ac01cd0d3b9d633b97d0259 WatchSource:0}: Error finding container 4b64aabc22330e5279f8982aa5e3f976426466127ac01cd0d3b9d633b97d0259: Status 404 returned error can't find the container with id 4b64aabc22330e5279f8982aa5e3f976426466127ac01cd0d3b9d633b97d0259 Apr 21 15:35:21.932120 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.932100 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d2130c_7332_485b_95f8_0728da25787a.slice/crio-5e8bf9e5d444239b464082e7acc70f96ccf863206ec8353950fb9b9193990fef WatchSource:0}: Error finding container 5e8bf9e5d444239b464082e7acc70f96ccf863206ec8353950fb9b9193990fef: Status 404 returned error can't find the container with id 5e8bf9e5d444239b464082e7acc70f96ccf863206ec8353950fb9b9193990fef Apr 21 15:35:21.933497 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.933467 2570 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b4859e_da58_4584_a53e_a4daaccafc4c.slice/crio-6dbc37960a8d5d685f217e8f6a7102a6ae9d613924ae1cfb40efb3bd56091189 WatchSource:0}: Error finding container 6dbc37960a8d5d685f217e8f6a7102a6ae9d613924ae1cfb40efb3bd56091189: Status 404 returned error can't find the container with id 6dbc37960a8d5d685f217e8f6a7102a6ae9d613924ae1cfb40efb3bd56091189 Apr 21 15:35:21.934563 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:21.934541 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aae9a17_ae2e_4328_91d9_7fb4b43f79e2.slice/crio-b1828f0e6fa405b65d41869347c4fba60c745f9f5e2368f6f764391c067fca7d WatchSource:0}: Error finding container b1828f0e6fa405b65d41869347c4fba60c745f9f5e2368f6f764391c067fca7d: Status 404 returned error can't find the container with id b1828f0e6fa405b65d41869347c4fba60c745f9f5e2368f6f764391c067fca7d Apr 21 15:35:22.225450 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.225367 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:20 +0000 UTC" deadline="2027-10-03 13:42:36.170337968 +0000 UTC" Apr 21 15:35:22.225450 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.225397 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12718h7m13.944943201s" Apr 21 15:35:22.333265 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.333236 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:22.333417 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.333381 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:22.339961 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.339920 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4dx6" event={"ID":"81a2659f-602c-4c12-bd0d-20488c10a56f","Type":"ContainerStarted","Data":"a59ee48d87edb7e13cadaaf7f7ff749189d0ef43aa3310b940e56cfcf491aebb"} Apr 21 15:35:22.341367 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.341331 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerStarted","Data":"e4b3a639229c208d37957cebbfaeb9fbbed487bde2fc4adc8779770e4d55f18d"} Apr 21 15:35:22.343200 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.343089 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal" event={"ID":"c9f0db16f43c11bf91bf71ea6d873f37","Type":"ContainerStarted","Data":"c8185f6ae1fe5d9ee2d055cddc5b848150020204ad526a792909f63e6de40838"} Apr 21 15:35:22.344502 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.344455 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"b1828f0e6fa405b65d41869347c4fba60c745f9f5e2368f6f764391c067fca7d"} Apr 21 15:35:22.345686 ip-10-0-133-237 kubenswrapper[2570]: I0421 
15:35:22.345664 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9222j" event={"ID":"50b4859e-da58-4584-a53e-a4daaccafc4c","Type":"ContainerStarted","Data":"6dbc37960a8d5d685f217e8f6a7102a6ae9d613924ae1cfb40efb3bd56091189"} Apr 21 15:35:22.347264 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.347240 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qsgrz" event={"ID":"08d2130c-7332-485b-95f8-0728da25787a","Type":"ContainerStarted","Data":"5e8bf9e5d444239b464082e7acc70f96ccf863206ec8353950fb9b9193990fef"} Apr 21 15:35:22.348480 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.348459 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tvmfk" event={"ID":"96ab5087-ce27-46c7-81df-f0c0d2767c77","Type":"ContainerStarted","Data":"4b64aabc22330e5279f8982aa5e3f976426466127ac01cd0d3b9d633b97d0259"} Apr 21 15:35:22.350072 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.350048 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" event={"ID":"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc","Type":"ContainerStarted","Data":"ef75abf33ae7b24b3ed3d701901e01a6845f99e5e8d028b0e0c0378147d9b631"} Apr 21 15:35:22.351235 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.351209 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" event={"ID":"1962acd0-a20c-4e31-9994-8d210722d639","Type":"ContainerStarted","Data":"b311731498e330a09e0e0dd0e6cbe047663cc2b1bca7f5e39c822e2fae0d2b07"} Apr 21 15:35:22.362826 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.362750 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-237.ec2.internal" podStartSLOduration=2.362738376 podStartE2EDuration="2.362738376s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:22.362057082 +0000 UTC m=+3.662911282" watchObservedRunningTime="2026-04-21 15:35:22.362738376 +0000 UTC m=+3.663592577" Apr 21 15:35:22.800300 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.800250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:22.800577 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.800402 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:22.800577 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.800462 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:24.800444959 +0000 UTC m=+6.101299141 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:22.901311 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:22.901272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:22.901486 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.901470 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:22.901544 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.901493 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:22.901544 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.901505 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:22.901654 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:22.901560 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:24.901543011 +0000 UTC m=+6.202397205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:23.335181 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.335148 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:23.335643 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:23.335271 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:23.362077 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.362039 2570 generic.go:358] "Generic (PLEG): container finished" podID="7cf83ad13b80947a7f325864637563c9" containerID="b7dc5cde3149f2bed7d70d957a4f3a314fe9017c86ca03bd39c9113486948a34" exitCode=0 Apr 21 15:35:23.368246 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.368201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal" event={"ID":"7cf83ad13b80947a7f325864637563c9","Type":"ContainerDied","Data":"b7dc5cde3149f2bed7d70d957a4f3a314fe9017c86ca03bd39c9113486948a34"} Apr 21 15:35:23.786614 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.785833 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dmvhg"] Apr 21 15:35:23.788303 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.787772 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.788303 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:23.787848 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:23.807327 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.807111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90b42886-9124-47a7-8a37-518ea2f64986-dbus\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.807327 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.807174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.807327 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.807215 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90b42886-9124-47a7-8a37-518ea2f64986-kubelet-config\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.908018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90b42886-9124-47a7-8a37-518ea2f64986-dbus\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.908065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.908107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90b42886-9124-47a7-8a37-518ea2f64986-kubelet-config\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.908272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/90b42886-9124-47a7-8a37-518ea2f64986-kubelet-config\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:23.908409 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/90b42886-9124-47a7-8a37-518ea2f64986-dbus\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:23.908509 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:23.909323 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:23.908590 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret podName:90b42886-9124-47a7-8a37-518ea2f64986 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:24.408549366 +0000 UTC m=+5.709403548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret") pod "global-pull-secret-syncer-dmvhg" (UID: "90b42886-9124-47a7-8a37-518ea2f64986") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:24.332362 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:24.332328 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:24.332558 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.332484 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:24.369981 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:24.369227 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal" event={"ID":"7cf83ad13b80947a7f325864637563c9","Type":"ContainerStarted","Data":"6d95fc449aa8c563936a974e84c256cbc86d897c19e40b9794e6e7b1e1757a38"} Apr 21 15:35:24.385983 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:24.385931 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-237.ec2.internal" podStartSLOduration=4.385911881 podStartE2EDuration="4.385911881s" podCreationTimestamp="2026-04-21 15:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:24.385838816 +0000 UTC m=+5.686693022" watchObservedRunningTime="2026-04-21 15:35:24.385911881 +0000 UTC m=+5.686766086" Apr 21 15:35:24.412408 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:24.411809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:24.412408 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.412004 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:24.412408 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.412065 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret 
podName:90b42886-9124-47a7-8a37-518ea2f64986 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:25.412047283 +0000 UTC m=+6.712901463 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret") pod "global-pull-secret-syncer-dmvhg" (UID: "90b42886-9124-47a7-8a37-518ea2f64986") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:24.814924 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:24.814883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:24.815089 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.815030 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:24.815164 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.815094 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:28.815076882 +0000 UTC m=+10.115931085 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:24.915719 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:24.915687 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:24.915896 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.915855 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:24.915896 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.915873 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:24.915896 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.915885 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:24.916052 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:24.915939 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:28.915920196 +0000 UTC m=+10.216774382 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:25.332803 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:25.332475 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:25.332803 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:25.332602 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:25.332803 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:25.332475 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:25.332803 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:25.332753 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:25.418433 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:25.418397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:25.418893 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:25.418548 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:25.418893 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:25.418617 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret podName:90b42886-9124-47a7-8a37-518ea2f64986 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.418597171 +0000 UTC m=+8.719451363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret") pod "global-pull-secret-syncer-dmvhg" (UID: "90b42886-9124-47a7-8a37-518ea2f64986") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:26.332970 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:26.332936 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:26.333151 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:26.333079 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:27.332550 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:27.332513 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:27.332984 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:27.332691 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:27.333111 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:27.332514 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:27.333240 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:27.333202 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:27.435400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:27.435280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:27.435581 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:27.435411 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:27.435581 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:27.435490 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret podName:90b42886-9124-47a7-8a37-518ea2f64986 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:31.43547453 +0000 UTC m=+12.736328713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret") pod "global-pull-secret-syncer-dmvhg" (UID: "90b42886-9124-47a7-8a37-518ea2f64986") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:28.332846 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:28.332739 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:28.333343 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.332887 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:28.845975 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:28.845938 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:28.846204 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.846116 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:28.846282 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.846214 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:36.84619278 +0000 UTC m=+18.147046967 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:28.946745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:28.946700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:28.946931 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.946871 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:28.946931 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.946885 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:28.946931 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.946894 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:28.947073 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:28.946943 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:36.946928968 +0000 UTC m=+18.247783149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:29.333420 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:29.333380 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:29.333875 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:29.333498 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:29.333875 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:29.333660 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:29.333875 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:29.333820 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:30.332873 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:30.332846 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:30.333000 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:30.332965 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:31.332574 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:31.332539 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:31.333012 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:31.332581 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:31.333012 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:31.332658 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:31.333012 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:31.332787 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:31.468313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:31.468280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:31.468476 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:31.468436 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:31.468531 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:31.468503 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret podName:90b42886-9124-47a7-8a37-518ea2f64986 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:39.468487458 +0000 UTC m=+20.769341638 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret") pod "global-pull-secret-syncer-dmvhg" (UID: "90b42886-9124-47a7-8a37-518ea2f64986") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:32.332630 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:32.332599 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:32.333049 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:32.332737 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:33.332490 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:33.332455 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:33.332680 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:33.332455 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:33.332680 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:33.332590 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:33.332680 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:33.332665 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:34.332874 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:34.332838 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:34.333365 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:34.332973 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:35.332407 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:35.332366 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:35.332580 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:35.332366 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:35.332580 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:35.332482 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:35.332669 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:35.332574 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:36.332613 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:36.332580 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:36.333052 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:36.332697 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:36.911877 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:36.911838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:36.912051 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:36.912010 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:36.912107 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:36.912080 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:52.912059247 +0000 UTC m=+34.212913429 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:37.012992 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:37.012950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:37.013213 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:37.013151 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:37.013213 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:37.013176 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:37.013213 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:37.013188 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:37.013368 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:37.013248 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:53.013231075 +0000 UTC m=+34.314085272 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:37.332144 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:37.332107 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:37.332308 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:37.332160 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:37.332308 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:37.332236 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:37.332389 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:37.332346 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:38.332720 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:38.332697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:38.333028 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:38.332795 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:39.335462 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.335113 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:39.336097 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.335115 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:39.336097 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:39.335499 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986" Apr 21 15:35:39.336097 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:39.335539 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71" Apr 21 15:35:39.399481 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.399389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" event={"ID":"1962acd0-a20c-4e31-9994-8d210722d639","Type":"ContainerStarted","Data":"7bca90b10f9c362eccfde9d70f9858198720e990cc007b2b3061f5943edf359a"} Apr 21 15:35:39.400696 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.400670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n4dx6" event={"ID":"81a2659f-602c-4c12-bd0d-20488c10a56f","Type":"ContainerStarted","Data":"3b995d1f037efea97b5ea315e1f382c1d019c5da14bcca90eefd37ef28b7125f"} Apr 21 15:35:39.402052 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.402028 2570 generic.go:358] "Generic (PLEG): container finished" podID="abc50d87-ddda-484f-bcda-07b2af6fbf70" containerID="6ea666caa92287656dcaa0921ef6a0229ea2c553649e2f305737c4ca0ef7df53" exitCode=0 Apr 21 15:35:39.402151 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.402113 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerDied","Data":"6ea666caa92287656dcaa0921ef6a0229ea2c553649e2f305737c4ca0ef7df53"} Apr 21 15:35:39.405239 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.405202 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"23006bf38f0cfe4fc9de676c7049407aeaadb34ac8ec9d6e3a3b9ae281314de9"} Apr 21 15:35:39.405239 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.405232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"86504ece4179c12c28a033e3c183142d5c18ce0699aaca9cc8c3d0b77f751bd5"} Apr 21 15:35:39.405383 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.405245 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"e65f15540d61110198d22aa019289bc490894193934d813315feee02d764f05a"} Apr 21 15:35:39.405383 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.405256 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"264f61b152a103435188d90f79d41c24b3718d86fc062e2be959663ce81b8f4c"} Apr 21 15:35:39.405383 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.405268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"34ef683c05400007c2308630c27568e2ddd3b00ca17df041e3c0c660c081e6c1"} Apr 21 15:35:39.405383 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.405278 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"e372ef69c1e6edf28e6a0eca10de98f93d70361ff91872252670c212afa3689f"} Apr 21 15:35:39.406876 ip-10-0-133-237 kubenswrapper[2570]: I0421 
15:35:39.406853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9222j" event={"ID":"50b4859e-da58-4584-a53e-a4daaccafc4c","Type":"ContainerStarted","Data":"42f812f090bace07c020a0d039bc4c78ceaa2a0095ecf45780409dbb40bce35d"} Apr 21 15:35:39.408411 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.408367 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qsgrz" event={"ID":"08d2130c-7332-485b-95f8-0728da25787a","Type":"ContainerStarted","Data":"1e59ac5d8013d823ce56802f60fbae03319845cd042d1980297660dd38db2d5f"} Apr 21 15:35:39.409821 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.409802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" event={"ID":"9fe3c049-1ac3-41bd-9da4-6cbd245a22bc","Type":"ContainerStarted","Data":"421c40db2b4b631f9320bc574a2ca1ba3c9fb73974fd812b3016f39865595c51"} Apr 21 15:35:39.421580 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.421216 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n4dx6" podStartSLOduration=3.983492174 podStartE2EDuration="20.421201858s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.929273418 +0000 UTC m=+3.230127599" lastFinishedPulling="2026-04-21 15:35:38.366983088 +0000 UTC m=+19.667837283" observedRunningTime="2026-04-21 15:35:39.420595529 +0000 UTC m=+20.721449732" watchObservedRunningTime="2026-04-21 15:35:39.421201858 +0000 UTC m=+20.722056060" Apr 21 15:35:39.453744 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.453702 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r9ngc" podStartSLOduration=4.042297422 podStartE2EDuration="20.453689919s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.925737305 +0000 UTC m=+3.226591499" 
lastFinishedPulling="2026-04-21 15:35:38.337129803 +0000 UTC m=+19.637983996" observedRunningTime="2026-04-21 15:35:39.438358356 +0000 UTC m=+20.739212558" watchObservedRunningTime="2026-04-21 15:35:39.453689919 +0000 UTC m=+20.754544120" Apr 21 15:35:39.474147 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.474090 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qsgrz" podStartSLOduration=4.070812387 podStartE2EDuration="20.474080128s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.933929222 +0000 UTC m=+3.234783403" lastFinishedPulling="2026-04-21 15:35:38.337196951 +0000 UTC m=+19.638051144" observedRunningTime="2026-04-21 15:35:39.453270126 +0000 UTC m=+20.754124319" watchObservedRunningTime="2026-04-21 15:35:39.474080128 +0000 UTC m=+20.774934329" Apr 21 15:35:39.489109 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.489062 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9222j" podStartSLOduration=4.087189808 podStartE2EDuration="20.489044405s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.935292014 +0000 UTC m=+3.236146194" lastFinishedPulling="2026-04-21 15:35:38.337146593 +0000 UTC m=+19.638000791" observedRunningTime="2026-04-21 15:35:39.48895838 +0000 UTC m=+20.789812583" watchObservedRunningTime="2026-04-21 15:35:39.489044405 +0000 UTC m=+20.789898608" Apr 21 15:35:39.508876 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.508850 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:35:39.529053 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:39.529020 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:39.529445 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:39.529426 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:39.529530 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:39.529480 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret podName:90b42886-9124-47a7-8a37-518ea2f64986 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:55.52946281 +0000 UTC m=+36.830316999 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret") pod "global-pull-secret-syncer-dmvhg" (UID: "90b42886-9124-47a7-8a37-518ea2f64986") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:40.243368 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.243105 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:35:39.508873971Z","UUID":"33b8b245-f10a-473c-84df-25a63ae26d7f","Handler":null,"Name":"","Endpoint":""} Apr 21 15:35:40.246688 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.246657 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:35:40.246688 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.246684 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:35:40.290361 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.290335 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6kcrm"] Apr 21 15:35:40.308422 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.308389 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.310688 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.310669 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8dt82\"" Apr 21 15:35:40.310813 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.310670 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:35:40.310813 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.310670 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 15:35:40.332974 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.332950 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:40.333093 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:40.333063 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7" Apr 21 15:35:40.413037 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.413001 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tvmfk" event={"ID":"96ab5087-ce27-46c7-81df-f0c0d2767c77","Type":"ContainerStarted","Data":"d05f861433c56d91e208461574c2f9f4f91c4d9392bf9491b42123744e68adf8"} Apr 21 15:35:40.414831 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.414785 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" event={"ID":"1962acd0-a20c-4e31-9994-8d210722d639","Type":"ContainerStarted","Data":"7e9696d2ffbcdc91b0cb8ce45674d3825d523e51021d9f5b370f07c5b944e3cd"} Apr 21 15:35:40.432458 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.430807 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tvmfk" podStartSLOduration=5.026296638 podStartE2EDuration="21.430790857s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.932728896 +0000 UTC m=+3.233583084" lastFinishedPulling="2026-04-21 15:35:38.337223108 +0000 UTC m=+19.638077303" observedRunningTime="2026-04-21 15:35:40.430230243 +0000 UTC m=+21.731084444" watchObservedRunningTime="2026-04-21 15:35:40.430790857 +0000 UTC m=+21.731645071" Apr 21 15:35:40.436354 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.436330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d5081a65-e77f-4228-83e9-044b28aa3b8b-hosts-file\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.436482 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.436371 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d5081a65-e77f-4228-83e9-044b28aa3b8b-tmp-dir\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.436482 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.436408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkl6\" (UniqueName: \"kubernetes.io/projected/d5081a65-e77f-4228-83e9-044b28aa3b8b-kube-api-access-fpkl6\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.537179 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.536934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d5081a65-e77f-4228-83e9-044b28aa3b8b-tmp-dir\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.537309 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.537226 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpkl6\" (UniqueName: \"kubernetes.io/projected/d5081a65-e77f-4228-83e9-044b28aa3b8b-kube-api-access-fpkl6\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.537309 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.537297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d5081a65-e77f-4228-83e9-044b28aa3b8b-tmp-dir\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.537435 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.537418 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d5081a65-e77f-4228-83e9-044b28aa3b8b-hosts-file\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.537496 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.537482 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d5081a65-e77f-4228-83e9-044b28aa3b8b-hosts-file\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.549030 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.549005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpkl6\" (UniqueName: \"kubernetes.io/projected/d5081a65-e77f-4228-83e9-044b28aa3b8b-kube-api-access-fpkl6\") pod \"node-resolver-6kcrm\" (UID: \"d5081a65-e77f-4228-83e9-044b28aa3b8b\") " pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.619371 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:40.619348 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6kcrm" Apr 21 15:35:40.628571 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:40.628542 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5081a65_e77f_4228_83e9_044b28aa3b8b.slice/crio-799c0c784fb79249e18a14bc28193d063e1c0d9ec3831b31aa54419c308c3132 WatchSource:0}: Error finding container 799c0c784fb79249e18a14bc28193d063e1c0d9ec3831b31aa54419c308c3132: Status 404 returned error can't find the container with id 799c0c784fb79249e18a14bc28193d063e1c0d9ec3831b31aa54419c308c3132 Apr 21 15:35:41.332348 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.332313 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:41.332491 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.332313 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:41.332491 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:41.332429 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:41.332594 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:41.332502 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:41.420266 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.420228 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"902594f9733302dd7a40f71169d8f67b4e5272c27d6e30251476e65b9bcb5b4f"}
Apr 21 15:35:41.422088 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.422053 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" event={"ID":"1962acd0-a20c-4e31-9994-8d210722d639","Type":"ContainerStarted","Data":"2f8073f43eccec3c33156e5d3bd17a61ea2dee4fa54f8a78429c5917eac63480"}
Apr 21 15:35:41.423707 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.423681 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6kcrm" event={"ID":"d5081a65-e77f-4228-83e9-044b28aa3b8b","Type":"ContainerStarted","Data":"159c79e03708a89ad74292eb3972f01ccc61af5ab90a0158fdb2aff8d02bca88"}
Apr 21 15:35:41.423796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.423719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6kcrm" event={"ID":"d5081a65-e77f-4228-83e9-044b28aa3b8b","Type":"ContainerStarted","Data":"799c0c784fb79249e18a14bc28193d063e1c0d9ec3831b31aa54419c308c3132"}
Apr 21 15:35:41.450796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.450743 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6g8bm" podStartSLOduration=3.877815994 podStartE2EDuration="22.450729367s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.931634353 +0000 UTC m=+3.232488533" lastFinishedPulling="2026-04-21 15:35:40.504547707 +0000 UTC m=+21.805401906" observedRunningTime="2026-04-21 15:35:41.450075509 +0000 UTC m=+22.750929713" watchObservedRunningTime="2026-04-21 15:35:41.450729367 +0000 UTC m=+22.751583573"
Apr 21 15:35:41.471500 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.471452 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6kcrm" podStartSLOduration=1.47143642 podStartE2EDuration="1.47143642s" podCreationTimestamp="2026-04-21 15:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:41.471333384 +0000 UTC m=+22.772187588" watchObservedRunningTime="2026-04-21 15:35:41.47143642 +0000 UTC m=+22.772290623"
Apr 21 15:35:41.481472 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.481415 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:41.482101 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:41.482082 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:42.332587 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:42.332553 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:42.332778 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:42.332688 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7"
Apr 21 15:35:42.425792 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:42.425698 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:42.426409 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:42.426391 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9222j"
Apr 21 15:35:43.335216 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:43.334982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:43.335328 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:43.334982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:43.335328 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:43.335250 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:43.335328 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:43.335304 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:43.428442 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:43.428415 2570 generic.go:358] "Generic (PLEG): container finished" podID="abc50d87-ddda-484f-bcda-07b2af6fbf70" containerID="ea1a492b34d04a89ecacef18602ec8c39f2d7408834348abd60ad85b4f1a6e0d" exitCode=0
Apr 21 15:35:43.428732 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:43.428504 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerDied","Data":"ea1a492b34d04a89ecacef18602ec8c39f2d7408834348abd60ad85b4f1a6e0d"}
Apr 21 15:35:44.332680 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.332649 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:44.332822 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:44.332748 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7"
Apr 21 15:35:44.432809 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.432714 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" event={"ID":"7aae9a17-ae2e-4328-91d9-7fb4b43f79e2","Type":"ContainerStarted","Data":"adc1515a9519e8db7093125851fc88fdac1a16bf10e56cdbc2919f2fcbd51272"}
Apr 21 15:35:44.433258 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.432949 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:44.434600 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.434569 2570 generic.go:358] "Generic (PLEG): container finished" podID="abc50d87-ddda-484f-bcda-07b2af6fbf70" containerID="8096c44b21614e7a05e57d907a394e1e6152fddd5c6baa28d1809166791054ff" exitCode=0
Apr 21 15:35:44.434725 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.434662 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerDied","Data":"8096c44b21614e7a05e57d907a394e1e6152fddd5c6baa28d1809166791054ff"}
Apr 21 15:35:44.447596 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.447446 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:44.485557 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:44.485517 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" podStartSLOduration=8.752338878 podStartE2EDuration="25.485503396s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.936214356 +0000 UTC m=+3.237068539" lastFinishedPulling="2026-04-21 15:35:38.669378876 +0000 UTC m=+19.970233057" observedRunningTime="2026-04-21 15:35:44.484202541 +0000 UTC m=+25.785056742" watchObservedRunningTime="2026-04-21 15:35:44.485503396 +0000 UTC m=+25.786357598"
Apr 21 15:35:45.332678 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.332647 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:45.332678 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.332667 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:45.332874 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:45.332738 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:45.332921 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:45.332868 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:45.438354 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.438323 2570 generic.go:358] "Generic (PLEG): container finished" podID="abc50d87-ddda-484f-bcda-07b2af6fbf70" containerID="7d7ce9dc681c7dec2d057419153a4d93cb0d805521d8af1c1d43bf82a80f8aeb" exitCode=0
Apr 21 15:35:45.438901 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.438404 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerDied","Data":"7d7ce9dc681c7dec2d057419153a4d93cb0d805521d8af1c1d43bf82a80f8aeb"}
Apr 21 15:35:45.439296 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.439037 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:45.439296 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.439062 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:45.452463 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.452444 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn"
Apr 21 15:35:45.614400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.614328 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4spst"]
Apr 21 15:35:45.614536 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.614417 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:45.614536 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:45.614485 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:45.615534 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.615511 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dmvhg"]
Apr 21 15:35:45.615665 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.615590 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:45.615713 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:45.615665 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:45.616331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.616310 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x2rv7"]
Apr 21 15:35:45.616441 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:45.616421 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:45.616565 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:45.616545 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7"
Apr 21 15:35:47.332649 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:47.332610 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:47.332649 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:47.332638 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:47.333190 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:47.332662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:47.333190 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:47.332746 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:47.333190 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:47.333116 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:47.333345 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:47.333227 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7"
Apr 21 15:35:49.333479 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:49.333446 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:49.334058 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:49.333526 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:49.334058 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:49.333528 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:49.334058 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:49.333593 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:49.334058 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:49.333727 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7"
Apr 21 15:35:49.334058 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:49.333816 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:51.334778 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:51.334749 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg"
Apr 21 15:35:51.335225 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:51.334760 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst"
Apr 21 15:35:51.335225 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:51.334856 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dmvhg" podUID="90b42886-9124-47a7-8a37-518ea2f64986"
Apr 21 15:35:51.335225 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:51.334908 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4spst" podUID="efb241d1-f7e0-44b6-8014-d8a71973aa71"
Apr 21 15:35:51.335225 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:51.334759 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7"
Apr 21 15:35:51.335225 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:51.334980 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x2rv7" podUID="54264bba-76e1-44c8-8581-4f2271e68bd7"
Apr 21 15:35:52.453441 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.453215 2570 generic.go:358] "Generic (PLEG): container finished" podID="abc50d87-ddda-484f-bcda-07b2af6fbf70" containerID="e4edbafda9a247b5c094635860bf460fa38f9091c351e4d101d06bae5a6ba11f" exitCode=0
Apr 21 15:35:52.453441 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.453294 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerDied","Data":"e4edbafda9a247b5c094635860bf460fa38f9091c351e4d101d06bae5a6ba11f"}
Apr 21 15:35:52.536564 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.536536 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-237.ec2.internal" event="NodeReady"
Apr 21 15:35:52.536740 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.536686 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 15:35:52.590718 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.590692 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66ffb968db-h6xsp"]
Apr 21 15:35:52.605805 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.605779 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4xm8n"]
Apr 21 15:35:52.605955 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.605929 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.608386 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.608286 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 15:35:52.608802 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.608781 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 15:35:52.608913 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.608784 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tfsv8\""
Apr 21 15:35:52.608913 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.608883 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 15:35:52.615904 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.615885 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 15:35:52.618230 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.618206 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9lpqk"]
Apr 21 15:35:52.618490 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.618433 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.620260 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.620241 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 15:35:52.620260 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.620256 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 15:35:52.620416 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.620256 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8774b\""
Apr 21 15:35:52.639612 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.639591 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66ffb968db-h6xsp"]
Apr 21 15:35:52.639612 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.639615 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9lpqk"]
Apr 21 15:35:52.639746 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.639623 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4xm8n"]
Apr 21 15:35:52.639746 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.639702 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9lpqk"
Apr 21 15:35:52.641931 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.641910 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 15:35:52.642052 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.642011 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 15:35:52.642712 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.642690 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 15:35:52.643069 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.643029 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l9cdd\""
Apr 21 15:35:52.730767 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730694 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk"
Apr 21 15:35:52.730767 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730730 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqt99\" (UniqueName: \"kubernetes.io/projected/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-kube-api-access-hqt99\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk"
Apr 21 15:35:52.730767 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-certificates\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.730767 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730762 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-trusted-ca\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730827 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-installation-pull-secrets\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730857 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63d08935-bd63-4c7f-83c9-df40083b472a-tmp-dir\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vfs\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-kube-api-access-z9vfs\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730949 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-image-registry-private-configuration\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730976 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-bound-sa-token\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731003 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.730993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/70851abd-b4ff-4289-82b5-49d3c83c3007-ca-trust-extracted\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.731265 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.731061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d08935-bd63-4c7f-83c9-df40083b472a-config-volume\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.731265 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.731095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc7sb\" (UniqueName: \"kubernetes.io/projected/63d08935-bd63-4c7f-83c9-df40083b472a-kube-api-access-tc7sb\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.831976 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.831946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk"
Apr 21 15:35:52.831976 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.831978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqt99\" (UniqueName: \"kubernetes.io/projected/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-kube-api-access-hqt99\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.831993 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-certificates\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-trusted-ca\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-installation-pull-secrets\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832064 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832104 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832110 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63d08935-bd63-4c7f-83c9-df40083b472a-tmp-dir\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832193 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert podName:b5a5676f-25d5-4f87-ad65-41d268c5e9f4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:53.332170859 +0000 UTC m=+34.633025063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert") pod "ingress-canary-9lpqk" (UID: "b5a5676f-25d5-4f87-ad65-41d268c5e9f4") : secret "canary-serving-cert" not found
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832220 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832233 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66ffb968db-h6xsp: secret "image-registry-tls" not found
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832233 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vfs\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-kube-api-access-z9vfs\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp"
Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832277 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls podName:70851abd-b4ff-4289-82b5-49d3c83c3007 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:53.332262614 +0000 UTC m=+34.633116814 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls") pod "image-registry-66ffb968db-h6xsp" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007") : secret "image-registry-tls" not found Apr 21 15:35:52.832304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832313 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-image-registry-private-configuration\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-bound-sa-token\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/70851abd-b4ff-4289-82b5-49d3c83c3007-ca-trust-extracted\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d08935-bd63-4c7f-83c9-df40083b472a-config-volume\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 
21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc7sb\" (UniqueName: \"kubernetes.io/projected/63d08935-bd63-4c7f-83c9-df40083b472a-kube-api-access-tc7sb\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/63d08935-bd63-4c7f-83c9-df40083b472a-tmp-dir\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.832634 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls podName:63d08935-bd63-4c7f-83c9-df40083b472a nodeName:}" failed. No retries permitted until 2026-04-21 15:35:53.332615597 +0000 UTC m=+34.633469777 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls") pod "dns-default-4xm8n" (UID: "63d08935-bd63-4c7f-83c9-df40083b472a") : secret "dns-default-metrics-tls" not found Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-certificates\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.832994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.832846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/70851abd-b4ff-4289-82b5-49d3c83c3007-ca-trust-extracted\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.833295 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.833020 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/63d08935-bd63-4c7f-83c9-df40083b472a-config-volume\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:52.833295 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.833083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-trusted-ca\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.836395 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.836372 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-installation-pull-secrets\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.836490 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.836379 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-image-registry-private-configuration\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.846788 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.846756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-bound-sa-token\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.847221 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.847201 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqt99\" (UniqueName: \"kubernetes.io/projected/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-kube-api-access-hqt99\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:35:52.848340 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.848317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc7sb\" (UniqueName: \"kubernetes.io/projected/63d08935-bd63-4c7f-83c9-df40083b472a-kube-api-access-tc7sb\") pod \"dns-default-4xm8n\" (UID: 
\"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:52.849554 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.849534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vfs\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-kube-api-access-z9vfs\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:52.933625 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:52.933585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:52.933824 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.933710 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:52.933824 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:52.933786 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs podName:54264bba-76e1-44c8-8581-4f2271e68bd7 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:24.933767732 +0000 UTC m=+66.234621919 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs") pod "network-metrics-daemon-x2rv7" (UID: "54264bba-76e1-44c8-8581-4f2271e68bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:53.034328 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.034296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:53.034484 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.034442 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:53.034484 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.034461 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:53.034484 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.034471 2570 projected.go:194] Error preparing data for projected volume kube-api-access-twv2f for pod openshift-network-diagnostics/network-check-target-4spst: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:53.034589 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.034520 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f podName:efb241d1-f7e0-44b6-8014-d8a71973aa71 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:36:25.03450605 +0000 UTC m=+66.335360235 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-twv2f" (UniqueName: "kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f") pod "network-check-target-4spst" (UID: "efb241d1-f7e0-44b6-8014-d8a71973aa71") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:53.332745 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.332663 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:35:53.332911 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.332663 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:53.332911 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.332664 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:35:53.335010 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.334990 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:35:53.335293 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.335275 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:35:53.335410 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.335308 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:35:53.335410 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.335309 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:35:53.335410 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.335377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l72g6\"" Apr 21 15:35:53.335410 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.335314 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dzs5m\"" Apr 21 15:35:53.336585 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.336570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:35:53.336636 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.336598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:53.336636 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.336615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:53.336722 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336703 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:53.336769 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336746 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:53.336769 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336706 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:53.336769 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336768 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert podName:b5a5676f-25d5-4f87-ad65-41d268c5e9f4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:54.336749367 +0000 UTC m=+35.637603561 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert") pod "ingress-canary-9lpqk" (UID: "b5a5676f-25d5-4f87-ad65-41d268c5e9f4") : secret "canary-serving-cert" not found Apr 21 15:35:53.336903 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336771 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66ffb968db-h6xsp: secret "image-registry-tls" not found Apr 21 15:35:53.336903 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336798 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls podName:63d08935-bd63-4c7f-83c9-df40083b472a nodeName:}" failed. No retries permitted until 2026-04-21 15:35:54.336780596 +0000 UTC m=+35.637634787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls") pod "dns-default-4xm8n" (UID: "63d08935-bd63-4c7f-83c9-df40083b472a") : secret "dns-default-metrics-tls" not found Apr 21 15:35:53.336903 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:53.336820 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls podName:70851abd-b4ff-4289-82b5-49d3c83c3007 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:54.336810652 +0000 UTC m=+35.637664839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls") pod "image-registry-66ffb968db-h6xsp" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007") : secret "image-registry-tls" not found Apr 21 15:35:53.457183 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.457150 2570 generic.go:358] "Generic (PLEG): container finished" podID="abc50d87-ddda-484f-bcda-07b2af6fbf70" containerID="dbaed5f7cd52d7e46faa98c9bd0ea7f1199f9cf0852752a7f87a71c91050a997" exitCode=0 Apr 21 15:35:53.457538 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:53.457207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerDied","Data":"dbaed5f7cd52d7e46faa98c9bd0ea7f1199f9cf0852752a7f87a71c91050a997"} Apr 21 15:35:54.345125 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.345083 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:35:54.345125 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.345125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.345164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: 
\"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345241 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345280 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345285 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345316 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert podName:b5a5676f-25d5-4f87-ad65-41d268c5e9f4 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:56.345297134 +0000 UTC m=+37.646151334 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert") pod "ingress-canary-9lpqk" (UID: "b5a5676f-25d5-4f87-ad65-41d268c5e9f4") : secret "canary-serving-cert" not found Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345320 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66ffb968db-h6xsp: secret "image-registry-tls" not found Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345333 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls podName:63d08935-bd63-4c7f-83c9-df40083b472a nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:56.345324291 +0000 UTC m=+37.646178472 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls") pod "dns-default-4xm8n" (UID: "63d08935-bd63-4c7f-83c9-df40083b472a") : secret "dns-default-metrics-tls" not found Apr 21 15:35:54.345398 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:54.345352 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls podName:70851abd-b4ff-4289-82b5-49d3c83c3007 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:56.345341852 +0000 UTC m=+37.646196048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls") pod "image-registry-66ffb968db-h6xsp" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007") : secret "image-registry-tls" not found Apr 21 15:35:54.461694 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.461663 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kltss" event={"ID":"abc50d87-ddda-484f-bcda-07b2af6fbf70","Type":"ContainerStarted","Data":"f6fc5b09de6c4130b4fa9d79264b55265b7a7b2a3b2f23c36b1721da94402d24"} Apr 21 15:35:54.499129 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.499082 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kltss" podStartSLOduration=6.055930445 podStartE2EDuration="35.499068705s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:35:21.925446732 +0000 UTC m=+3.226300925" lastFinishedPulling="2026-04-21 15:35:51.368584988 +0000 UTC m=+32.669439185" observedRunningTime="2026-04-21 15:35:54.498936172 +0000 UTC m=+35.799790374" watchObservedRunningTime="2026-04-21 
15:35:54.499068705 +0000 UTC m=+35.799922906" Apr 21 15:35:54.520117 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.520087 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2"] Apr 21 15:35:54.529836 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.529820 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" Apr 21 15:35:54.533378 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.533352 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:54.533525 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.533397 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 15:35:54.533525 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.533471 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zkl9p\"" Apr 21 15:35:54.539675 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.539654 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2"] Apr 21 15:35:54.647244 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.647157 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvw4h\" (UniqueName: \"kubernetes.io/projected/7b0361fd-9757-471d-becd-b04ebf9ab715-kube-api-access-mvw4h\") pod \"migrator-74bb7799d9-smkv2\" (UID: \"7b0361fd-9757-471d-becd-b04ebf9ab715\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" Apr 21 15:35:54.748476 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.748447 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mvw4h\" (UniqueName: \"kubernetes.io/projected/7b0361fd-9757-471d-becd-b04ebf9ab715-kube-api-access-mvw4h\") pod \"migrator-74bb7799d9-smkv2\" (UID: \"7b0361fd-9757-471d-becd-b04ebf9ab715\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" Apr 21 15:35:54.762791 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.762763 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvw4h\" (UniqueName: \"kubernetes.io/projected/7b0361fd-9757-471d-becd-b04ebf9ab715-kube-api-access-mvw4h\") pod \"migrator-74bb7799d9-smkv2\" (UID: \"7b0361fd-9757-471d-becd-b04ebf9ab715\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" Apr 21 15:35:54.841600 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.841568 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" Apr 21 15:35:54.985526 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:54.985343 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2"] Apr 21 15:35:54.989103 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:54.989068 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0361fd_9757_471d_becd_b04ebf9ab715.slice/crio-e7c0539eae0b1a5b7dd8bbd999d8a377c91a1803f05f7a32fe08d686b9f40ab1 WatchSource:0}: Error finding container e7c0539eae0b1a5b7dd8bbd999d8a377c91a1803f05f7a32fe08d686b9f40ab1: Status 404 returned error can't find the container with id e7c0539eae0b1a5b7dd8bbd999d8a377c91a1803f05f7a32fe08d686b9f40ab1 Apr 21 15:35:55.466890 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:55.466847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" 
event={"ID":"7b0361fd-9757-471d-becd-b04ebf9ab715","Type":"ContainerStarted","Data":"e7c0539eae0b1a5b7dd8bbd999d8a377c91a1803f05f7a32fe08d686b9f40ab1"} Apr 21 15:35:55.556462 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:55.556404 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:55.560183 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:55.560129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/90b42886-9124-47a7-8a37-518ea2f64986-original-pull-secret\") pod \"global-pull-secret-syncer-dmvhg\" (UID: \"90b42886-9124-47a7-8a37-518ea2f64986\") " pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:55.747293 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:55.747218 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dmvhg" Apr 21 15:35:56.114852 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:56.114813 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dmvhg"] Apr 21 15:35:56.118578 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:35:56.118547 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b42886_9124_47a7_8a37_518ea2f64986.slice/crio-055bcdc0041e9e9ac99efc0353cbc9aff6c3a45434e26b573ab8acd4afc0e6de WatchSource:0}: Error finding container 055bcdc0041e9e9ac99efc0353cbc9aff6c3a45434e26b573ab8acd4afc0e6de: Status 404 returned error can't find the container with id 055bcdc0041e9e9ac99efc0353cbc9aff6c3a45434e26b573ab8acd4afc0e6de Apr 21 15:35:56.363429 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:56.363410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:35:56.363505 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:56.363443 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:35:56.363505 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:56.363460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " 
pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:35:56.363570 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363557 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:56.363613 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363559 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:56.363649 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363619 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66ffb968db-h6xsp: secret "image-registry-tls" not found Apr 21 15:35:56.363682 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363559 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:56.363682 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363609 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert podName:b5a5676f-25d5-4f87-ad65-41d268c5e9f4 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.363595233 +0000 UTC m=+41.664449417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert") pod "ingress-canary-9lpqk" (UID: "b5a5676f-25d5-4f87-ad65-41d268c5e9f4") : secret "canary-serving-cert" not found Apr 21 15:35:56.363746 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363684 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls podName:70851abd-b4ff-4289-82b5-49d3c83c3007 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.363671509 +0000 UTC m=+41.664525692 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls") pod "image-registry-66ffb968db-h6xsp" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007") : secret "image-registry-tls" not found Apr 21 15:35:56.363746 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:35:56.363695 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls podName:63d08935-bd63-4c7f-83c9-df40083b472a nodeName:}" failed. No retries permitted until 2026-04-21 15:36:00.363688909 +0000 UTC m=+41.664543089 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls") pod "dns-default-4xm8n" (UID: "63d08935-bd63-4c7f-83c9-df40083b472a") : secret "dns-default-metrics-tls" not found Apr 21 15:35:56.470453 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:56.470243 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" event={"ID":"7b0361fd-9757-471d-becd-b04ebf9ab715","Type":"ContainerStarted","Data":"ae14f9f50917655df16437a6c1aa47d7be801a2eda1ab16e9a447df0047e3046"} Apr 21 15:35:56.471196 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:56.471172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dmvhg" event={"ID":"90b42886-9124-47a7-8a37-518ea2f64986","Type":"ContainerStarted","Data":"055bcdc0041e9e9ac99efc0353cbc9aff6c3a45434e26b573ab8acd4afc0e6de"} Apr 21 15:35:57.474442 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:57.474399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" event={"ID":"7b0361fd-9757-471d-becd-b04ebf9ab715","Type":"ContainerStarted","Data":"b7664fa957393df15e70c890a3c8a246e5ee470c5b40278276f78eae6580e183"} Apr 21 
15:35:57.491151 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:35:57.491083 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-smkv2" podStartSLOduration=2.146435599 podStartE2EDuration="3.491063296s" podCreationTimestamp="2026-04-21 15:35:54 +0000 UTC" firstStartedPulling="2026-04-21 15:35:54.990876076 +0000 UTC m=+36.291730257" lastFinishedPulling="2026-04-21 15:35:56.335503769 +0000 UTC m=+37.636357954" observedRunningTime="2026-04-21 15:35:57.49093396 +0000 UTC m=+38.791788185" watchObservedRunningTime="2026-04-21 15:35:57.491063296 +0000 UTC m=+38.791917497" Apr 21 15:36:00.394612 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:00.394512 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:36:00.394612 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:00.394556 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:36:00.394612 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:00.394582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394679 2570 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394740 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert podName:b5a5676f-25d5-4f87-ad65-41d268c5e9f4 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:08.394725136 +0000 UTC m=+49.695579316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert") pod "ingress-canary-9lpqk" (UID: "b5a5676f-25d5-4f87-ad65-41d268c5e9f4") : secret "canary-serving-cert" not found Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394679 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394814 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls podName:63d08935-bd63-4c7f-83c9-df40083b472a nodeName:}" failed. No retries permitted until 2026-04-21 15:36:08.394801883 +0000 UTC m=+49.695656064 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls") pod "dns-default-4xm8n" (UID: "63d08935-bd63-4c7f-83c9-df40083b472a") : secret "dns-default-metrics-tls" not found Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394686 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394830 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66ffb968db-h6xsp: secret "image-registry-tls" not found Apr 21 15:36:00.395025 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:00.394853 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls podName:70851abd-b4ff-4289-82b5-49d3c83c3007 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:08.394843127 +0000 UTC m=+49.695697969 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls") pod "image-registry-66ffb968db-h6xsp" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007") : secret "image-registry-tls" not found Apr 21 15:36:00.481639 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:00.481606 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dmvhg" event={"ID":"90b42886-9124-47a7-8a37-518ea2f64986","Type":"ContainerStarted","Data":"227fe843aff7640af6be167b68c475bf4f7a099145f092208e0f66f8ea2312c7"} Apr 21 15:36:00.511447 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:00.511399 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dmvhg" podStartSLOduration=33.598414098 podStartE2EDuration="37.511384778s" podCreationTimestamp="2026-04-21 15:35:23 +0000 UTC" firstStartedPulling="2026-04-21 15:35:56.120208665 +0000 UTC m=+37.421062845" lastFinishedPulling="2026-04-21 15:36:00.033179329 +0000 UTC m=+41.334033525" observedRunningTime="2026-04-21 15:36:00.510926343 +0000 UTC m=+41.811780545" watchObservedRunningTime="2026-04-21 15:36:00.511384778 +0000 UTC m=+41.812238980" Apr 21 15:36:08.451797 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.451756 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:36:08.451797 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.451800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " 
pod="openshift-dns/dns-default-4xm8n" Apr 21 15:36:08.452353 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.451931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:08.454839 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.454805 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/63d08935-bd63-4c7f-83c9-df40083b472a-metrics-tls\") pod \"dns-default-4xm8n\" (UID: \"63d08935-bd63-4c7f-83c9-df40083b472a\") " pod="openshift-dns/dns-default-4xm8n" Apr 21 15:36:08.454839 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.454817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"image-registry-66ffb968db-h6xsp\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:08.454994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.454901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5a5676f-25d5-4f87-ad65-41d268c5e9f4-cert\") pod \"ingress-canary-9lpqk\" (UID: \"b5a5676f-25d5-4f87-ad65-41d268c5e9f4\") " pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:36:08.527644 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.527612 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4xm8n" Apr 21 15:36:08.530430 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.530410 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:08.555316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.554898 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9lpqk" Apr 21 15:36:08.673797 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.673764 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4xm8n"] Apr 21 15:36:08.683992 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:08.683957 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d08935_bd63_4c7f_83c9_df40083b472a.slice/crio-ae17d8c2c1f79572d19743c86ed7d9618a1d7d181a388f21a84c4f7edf1a2cd6 WatchSource:0}: Error finding container ae17d8c2c1f79572d19743c86ed7d9618a1d7d181a388f21a84c4f7edf1a2cd6: Status 404 returned error can't find the container with id ae17d8c2c1f79572d19743c86ed7d9618a1d7d181a388f21a84c4f7edf1a2cd6 Apr 21 15:36:08.695756 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.695727 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66ffb968db-h6xsp"] Apr 21 15:36:08.698722 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:08.698691 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70851abd_b4ff_4289_82b5_49d3c83c3007.slice/crio-f658684bf45e7e7fc4c96c85bb8dc3a9e89686d55a35f26f6cadd1a71427caca WatchSource:0}: Error finding container f658684bf45e7e7fc4c96c85bb8dc3a9e89686d55a35f26f6cadd1a71427caca: Status 404 returned error can't find the container with id f658684bf45e7e7fc4c96c85bb8dc3a9e89686d55a35f26f6cadd1a71427caca Apr 21 15:36:08.707982 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:08.707956 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9lpqk"] Apr 21 
15:36:08.711338 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:08.711312 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a5676f_25d5_4f87_ad65_41d268c5e9f4.slice/crio-3dc2d9039bb23e52d8e219958d04b7060dbc251d47581b0b3c84a3facc1ae014 WatchSource:0}: Error finding container 3dc2d9039bb23e52d8e219958d04b7060dbc251d47581b0b3c84a3facc1ae014: Status 404 returned error can't find the container with id 3dc2d9039bb23e52d8e219958d04b7060dbc251d47581b0b3c84a3facc1ae014 Apr 21 15:36:09.498780 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:09.498736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4xm8n" event={"ID":"63d08935-bd63-4c7f-83c9-df40083b472a","Type":"ContainerStarted","Data":"ae17d8c2c1f79572d19743c86ed7d9618a1d7d181a388f21a84c4f7edf1a2cd6"} Apr 21 15:36:09.499860 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:09.499821 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9lpqk" event={"ID":"b5a5676f-25d5-4f87-ad65-41d268c5e9f4","Type":"ContainerStarted","Data":"3dc2d9039bb23e52d8e219958d04b7060dbc251d47581b0b3c84a3facc1ae014"} Apr 21 15:36:09.501393 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:09.501361 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" event={"ID":"70851abd-b4ff-4289-82b5-49d3c83c3007","Type":"ContainerStarted","Data":"af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7"} Apr 21 15:36:09.501393 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:09.501390 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" event={"ID":"70851abd-b4ff-4289-82b5-49d3c83c3007","Type":"ContainerStarted","Data":"f658684bf45e7e7fc4c96c85bb8dc3a9e89686d55a35f26f6cadd1a71427caca"} Apr 21 15:36:09.501611 ip-10-0-133-237 kubenswrapper[2570]: I0421 
15:36:09.501596 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:09.531661 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:09.531616 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" podStartSLOduration=35.531598789 podStartE2EDuration="35.531598789s" podCreationTimestamp="2026-04-21 15:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:36:09.531153269 +0000 UTC m=+50.832007474" watchObservedRunningTime="2026-04-21 15:36:09.531598789 +0000 UTC m=+50.832452988" Apr 21 15:36:11.509833 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:11.509799 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4xm8n" event={"ID":"63d08935-bd63-4c7f-83c9-df40083b472a","Type":"ContainerStarted","Data":"d1e19eed79b6f191393c49e1c640a331a380284c6075e74d115ee8c20ac84ce5"} Apr 21 15:36:11.509833 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:11.509832 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4xm8n" event={"ID":"63d08935-bd63-4c7f-83c9-df40083b472a","Type":"ContainerStarted","Data":"e7d858a08c7b7a3cc3e8ea4117728c10af885b07df3ab991c3db768cad85ee5c"} Apr 21 15:36:11.510318 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:11.510008 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4xm8n" Apr 21 15:36:11.511068 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:11.511047 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9lpqk" event={"ID":"b5a5676f-25d5-4f87-ad65-41d268c5e9f4","Type":"ContainerStarted","Data":"f8172b687a1df0eed60a8811f60502e5eaead4a4346c80b448fe70554ff80c1c"} Apr 21 15:36:11.530086 ip-10-0-133-237 
kubenswrapper[2570]: I0421 15:36:11.530038 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4xm8n" podStartSLOduration=17.749956777 podStartE2EDuration="19.530026967s" podCreationTimestamp="2026-04-21 15:35:52 +0000 UTC" firstStartedPulling="2026-04-21 15:36:08.686176677 +0000 UTC m=+49.987030859" lastFinishedPulling="2026-04-21 15:36:10.466246866 +0000 UTC m=+51.767101049" observedRunningTime="2026-04-21 15:36:11.529877708 +0000 UTC m=+52.830731911" watchObservedRunningTime="2026-04-21 15:36:11.530026967 +0000 UTC m=+52.830881170" Apr 21 15:36:11.550022 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:11.549971 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9lpqk" podStartSLOduration=17.793400442 podStartE2EDuration="19.54995325s" podCreationTimestamp="2026-04-21 15:35:52 +0000 UTC" firstStartedPulling="2026-04-21 15:36:08.712885159 +0000 UTC m=+50.013739340" lastFinishedPulling="2026-04-21 15:36:10.469437956 +0000 UTC m=+51.770292148" observedRunningTime="2026-04-21 15:36:11.548718327 +0000 UTC m=+52.849572529" watchObservedRunningTime="2026-04-21 15:36:11.54995325 +0000 UTC m=+52.850807452" Apr 21 15:36:17.453013 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:17.452987 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8ndqn" Apr 21 15:36:19.509469 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.509434 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l"] Apr 21 15:36:19.514233 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.514211 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.517400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517189 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 15:36:19.517400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517185 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 15:36:19.517400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517228 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 15:36:19.517400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517276 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 15:36:19.517400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517229 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 15:36:19.517400 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517282 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 15:36:19.517658 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.517623 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 15:36:19.528270 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.528244 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l"] Apr 21 15:36:19.538360 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.538338 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-hub\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.538463 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.538384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.538463 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.538405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzfsr\" (UniqueName: \"kubernetes.io/projected/12a1197a-7ac6-4a8e-a947-f4091d8aace2-kube-api-access-tzfsr\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.538542 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.538475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-ca\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.538542 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.538509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/12a1197a-7ac6-4a8e-a947-f4091d8aace2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.538617 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.538567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.617510 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.617470 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66ffb968db-h6xsp"] Apr 21 15:36:19.639669 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.639639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.639831 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.639678 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-hub\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.639831 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.639712 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.639831 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.639730 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzfsr\" (UniqueName: \"kubernetes.io/projected/12a1197a-7ac6-4a8e-a947-f4091d8aace2-kube-api-access-tzfsr\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.639831 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.639758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-ca\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.639831 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.639777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/12a1197a-7ac6-4a8e-a947-f4091d8aace2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.640498 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.640472 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/12a1197a-7ac6-4a8e-a947-f4091d8aace2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.641974 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.641945 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-ca\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.642097 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.642081 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-hub\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.642440 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.642416 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.642919 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.642903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/12a1197a-7ac6-4a8e-a947-f4091d8aace2-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.663714 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.663693 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzfsr\" (UniqueName: \"kubernetes.io/projected/12a1197a-7ac6-4a8e-a947-f4091d8aace2-kube-api-access-tzfsr\") pod \"cluster-proxy-proxy-agent-7c7c4df99f-v9h4l\" (UID: \"12a1197a-7ac6-4a8e-a947-f4091d8aace2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.833447 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.833413 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" Apr 21 15:36:19.949443 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:19.949392 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l"] Apr 21 15:36:19.954289 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:19.954265 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12a1197a_7ac6_4a8e_a947_f4091d8aace2.slice/crio-08d7db8857678fb89e60dc399c2c6b0faca9dd357aed0c1a651cf9b7842fe7bd WatchSource:0}: Error finding container 08d7db8857678fb89e60dc399c2c6b0faca9dd357aed0c1a651cf9b7842fe7bd: Status 404 returned error can't find the container with id 08d7db8857678fb89e60dc399c2c6b0faca9dd357aed0c1a651cf9b7842fe7bd Apr 21 15:36:20.537189 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:20.537125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" 
event={"ID":"12a1197a-7ac6-4a8e-a947-f4091d8aace2","Type":"ContainerStarted","Data":"08d7db8857678fb89e60dc399c2c6b0faca9dd357aed0c1a651cf9b7842fe7bd"} Apr 21 15:36:21.518341 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:21.518307 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4xm8n" Apr 21 15:36:24.550597 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:24.550548 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" event={"ID":"12a1197a-7ac6-4a8e-a947-f4091d8aace2","Type":"ContainerStarted","Data":"f83d7d0c5a4b2655aafd9ce1515c914d4bb89faf1a60844973712904a66a1402"} Apr 21 15:36:24.980263 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:24.980175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:36:24.982406 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:24.982377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:36:24.992895 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:24.992864 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54264bba-76e1-44c8-8581-4f2271e68bd7-metrics-certs\") pod \"network-metrics-daemon-x2rv7\" (UID: \"54264bba-76e1-44c8-8581-4f2271e68bd7\") " pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:36:25.080944 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.080904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twv2f\" (UniqueName: 
\"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:36:25.083202 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.083177 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:36:25.094228 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.094204 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:36:25.105640 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.105611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twv2f\" (UniqueName: \"kubernetes.io/projected/efb241d1-f7e0-44b6-8014-d8a71973aa71-kube-api-access-twv2f\") pod \"network-check-target-4spst\" (UID: \"efb241d1-f7e0-44b6-8014-d8a71973aa71\") " pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:36:25.145505 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.145469 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dzs5m\"" Apr 21 15:36:25.152516 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.152489 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x2rv7" Apr 21 15:36:25.154582 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.154510 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-l72g6\"" Apr 21 15:36:25.163519 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.163426 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:36:25.172624 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.172602 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4xm8n_63d08935-bd63-4c7f-83c9-df40083b472a/dns/0.log" Apr 21 15:36:25.283008 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.282946 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x2rv7"] Apr 21 15:36:25.304917 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.304876 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4spst"] Apr 21 15:36:25.331799 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:25.331776 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4xm8n_63d08935-bd63-4c7f-83c9-df40083b472a/kube-rbac-proxy/0.log" Apr 21 15:36:25.720270 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:25.720233 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54264bba_76e1_44c8_8581_4f2271e68bd7.slice/crio-1a1c2c6579fca62f647894dec4a73e97598466e4440451f8996922f0f69745e9 WatchSource:0}: Error finding container 1a1c2c6579fca62f647894dec4a73e97598466e4440451f8996922f0f69745e9: Status 404 returned error can't find the container with id 1a1c2c6579fca62f647894dec4a73e97598466e4440451f8996922f0f69745e9 Apr 21 15:36:25.720836 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:25.720715 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb241d1_f7e0_44b6_8014_d8a71973aa71.slice/crio-7d50adae81e748d5e878385d3ca53617932eae5c8c33facf3d60348b2f4b4f50 WatchSource:0}: Error finding container 7d50adae81e748d5e878385d3ca53617932eae5c8c33facf3d60348b2f4b4f50: Status 404 returned error can't find the container with id 
7d50adae81e748d5e878385d3ca53617932eae5c8c33facf3d60348b2f4b4f50 Apr 21 15:36:26.331734 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.331703 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6kcrm_d5081a65-e77f-4228-83e9-044b28aa3b8b/dns-node-resolver/0.log" Apr 21 15:36:26.557684 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.557643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x2rv7" event={"ID":"54264bba-76e1-44c8-8581-4f2271e68bd7","Type":"ContainerStarted","Data":"1a1c2c6579fca62f647894dec4a73e97598466e4440451f8996922f0f69745e9"} Apr 21 15:36:26.560033 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.560002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" event={"ID":"12a1197a-7ac6-4a8e-a947-f4091d8aace2","Type":"ContainerStarted","Data":"30fac04252d9a2ed04c541700db579a82b70b42a28b29b626c07e09ce970edd9"} Apr 21 15:36:26.560186 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.560039 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" event={"ID":"12a1197a-7ac6-4a8e-a947-f4091d8aace2","Type":"ContainerStarted","Data":"c3071de40d8561b569342be157c9d27de82047c8a65b8b4bec2199864b5f1388"} Apr 21 15:36:26.561294 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.561263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4spst" event={"ID":"efb241d1-f7e0-44b6-8014-d8a71973aa71","Type":"ContainerStarted","Data":"7d50adae81e748d5e878385d3ca53617932eae5c8c33facf3d60348b2f4b4f50"} Apr 21 15:36:26.581609 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.581557 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c7c4df99f-v9h4l" 
podStartSLOduration=1.761970388 podStartE2EDuration="7.581539052s" podCreationTimestamp="2026-04-21 15:36:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:19.955843341 +0000 UTC m=+61.256697522" lastFinishedPulling="2026-04-21 15:36:25.775412002 +0000 UTC m=+67.076266186" observedRunningTime="2026-04-21 15:36:26.580720743 +0000 UTC m=+67.881574958" watchObservedRunningTime="2026-04-21 15:36:26.581539052 +0000 UTC m=+67.882393253" Apr 21 15:36:26.932449 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:26.932380 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-66ffb968db-h6xsp_70851abd-b4ff-4289-82b5-49d3c83c3007/registry/0.log" Apr 21 15:36:27.565350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:27.565314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x2rv7" event={"ID":"54264bba-76e1-44c8-8581-4f2271e68bd7","Type":"ContainerStarted","Data":"e9b91b69bc2631bee7bb86d957f9c53779adc4e2d79c85c27c183fadd9b12831"} Apr 21 15:36:27.565350 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:27.565355 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x2rv7" event={"ID":"54264bba-76e1-44c8-8581-4f2271e68bd7","Type":"ContainerStarted","Data":"0a3520b2da70bd2f73014a05f724064eeaec35ced2053cf84004a34d397c8f03"} Apr 21 15:36:27.584326 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:27.584269 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x2rv7" podStartSLOduration=67.585273376 podStartE2EDuration="1m8.584252472s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:25.722090688 +0000 UTC m=+67.022944869" lastFinishedPulling="2026-04-21 15:36:26.721069773 +0000 UTC m=+68.021923965" observedRunningTime="2026-04-21 15:36:27.582606097 +0000 UTC m=+68.883460347" watchObservedRunningTime="2026-04-21 15:36:27.584252472 
+0000 UTC m=+68.885106677" Apr 21 15:36:27.930932 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:27.930850 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qsgrz_08d2130c-7332-485b-95f8-0728da25787a/node-ca/0.log" Apr 21 15:36:28.332477 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:28.332455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9lpqk_b5a5676f-25d5-4f87-ad65-41d268c5e9f4/serve-healthcheck-canary/0.log" Apr 21 15:36:28.569664 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:28.569583 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4spst" event={"ID":"efb241d1-f7e0-44b6-8014-d8a71973aa71","Type":"ContainerStarted","Data":"e6ef9eed5255438a3eecdadd59039160555d68a897cdb4bc09e90048ee01dafa"} Apr 21 15:36:28.569890 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:28.569867 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:36:29.622083 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:29.622055 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:29.643749 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:29.643696 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4spst" podStartSLOduration=68.066526823 podStartE2EDuration="1m10.643678592s" podCreationTimestamp="2026-04-21 15:35:19 +0000 UTC" firstStartedPulling="2026-04-21 15:36:25.722798191 +0000 UTC m=+67.023652375" lastFinishedPulling="2026-04-21 15:36:28.299949964 +0000 UTC m=+69.600804144" observedRunningTime="2026-04-21 15:36:28.58590308 +0000 UTC m=+69.886757293" watchObservedRunningTime="2026-04-21 15:36:29.643678592 +0000 UTC m=+70.944532795" Apr 21 15:36:30.979090 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:30.979051 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rfzdj"] Apr 21 15:36:31.008530 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.008505 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.011492 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.011462 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 15:36:31.012753 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.012725 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 15:36:31.012753 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.012735 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 15:36:31.012915 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.012732 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 15:36:31.013313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.013300 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 15:36:31.013750 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.013734 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mk69c\"" Apr 21 15:36:31.016219 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.016203 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 15:36:31.123674 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123637 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.123674 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-tls\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.123872 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce65871c-6257-4a8d-9c07-cc6a8eab239e-metrics-client-ca\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.123872 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123738 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-sys\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.123872 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123761 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-root\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " 
pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.123872 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123781 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-textfile\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.123872 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123806 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shpsj\" (UniqueName: \"kubernetes.io/projected/ce65871c-6257-4a8d-9c07-cc6a8eab239e-kube-api-access-shpsj\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.124012 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123889 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-wtmp\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.124012 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.123915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-accelerators-collector-config\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.224882 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.224844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.224882 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.224879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-tls\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.224900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce65871c-6257-4a8d-9c07-cc6a8eab239e-metrics-client-ca\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.224918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-sys\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.224965 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-sys\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:31.225004 2570 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-root\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:31.225071 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-tls podName:ce65871c-6257-4a8d-9c07-cc6a8eab239e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:31.725052347 +0000 UTC m=+73.025906534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-tls") pod "node-exporter-rfzdj" (UID: "ce65871c-6257-4a8d-9c07-cc6a8eab239e") : secret "node-exporter-tls" not found Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225092 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-root\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225121 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-textfile\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225128 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shpsj\" (UniqueName: \"kubernetes.io/projected/ce65871c-6257-4a8d-9c07-cc6a8eab239e-kube-api-access-shpsj\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225213 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-wtmp\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-accelerators-collector-config\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225347 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-wtmp\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.225506 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce65871c-6257-4a8d-9c07-cc6a8eab239e-metrics-client-ca\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 
21 15:36:31.225704 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.225684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-accelerators-collector-config\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.227166 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.227149 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.235185 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.235113 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shpsj\" (UniqueName: \"kubernetes.io/projected/ce65871c-6257-4a8d-9c07-cc6a8eab239e-kube-api-access-shpsj\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.236540 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.236520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-textfile\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.728207 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.728164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-tls\") pod 
\"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.730331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.730310 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ce65871c-6257-4a8d-9c07-cc6a8eab239e-node-exporter-tls\") pod \"node-exporter-rfzdj\" (UID: \"ce65871c-6257-4a8d-9c07-cc6a8eab239e\") " pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.828903 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.828869 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tqfjd"] Apr 21 15:36:31.833706 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.833687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tqfjd" Apr 21 15:36:31.836296 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.836269 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:36:31.836398 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.836365 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:36:31.837857 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.837833 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dcdll\"" Apr 21 15:36:31.837944 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.837874 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:36:31.837944 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.837874 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:36:31.842494 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.842467 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tqfjd"] Apr 21 15:36:31.917370 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.917339 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rfzdj" Apr 21 15:36:31.924803 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:31.924775 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce65871c_6257_4a8d_9c07_cc6a8eab239e.slice/crio-7a71426ea78866c44f432e815cbcc966b0fe8c8c08a4ef8a418928be5be631b6 WatchSource:0}: Error finding container 7a71426ea78866c44f432e815cbcc966b0fe8c8c08a4ef8a418928be5be631b6: Status 404 returned error can't find the container with id 7a71426ea78866c44f432e815cbcc966b0fe8c8c08a4ef8a418928be5be631b6 Apr 21 15:36:31.929283 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.929258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/45da1054-656b-482f-b6dd-c2df1f588ac7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd" Apr 21 15:36:31.929379 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.929304 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/45da1054-656b-482f-b6dd-c2df1f588ac7-crio-socket\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd" Apr 21 15:36:31.929379 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.929346 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/45da1054-656b-482f-b6dd-c2df1f588ac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd" Apr 21 15:36:31.929487 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.929402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/45da1054-656b-482f-b6dd-c2df1f588ac7-data-volume\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd" Apr 21 15:36:31.929487 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.929458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnbh\" (UniqueName: \"kubernetes.io/projected/45da1054-656b-482f-b6dd-c2df1f588ac7-kube-api-access-rvnbh\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd" Apr 21 15:36:31.999470 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:31.999395 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:36:32.004395 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.004378 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.008092 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008067 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 15:36:32.008231 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008157 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 15:36:32.008231 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008187 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 15:36:32.008231 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008067 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 15:36:32.008231 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008067 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 15:36:32.008427 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008242 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 15:36:32.008427 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008192 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4ndch\""
Apr 21 15:36:32.008427 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008197 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 15:36:32.008427 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008166 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 15:36:32.008427 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.008300 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 15:36:32.023783 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.023756 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 15:36:32.030652 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030626 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/45da1054-656b-482f-b6dd-c2df1f588ac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.030779 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030679 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-web-config\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.030779 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.030779 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:32.030772 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 15:36:32.030885 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:32.030836 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45da1054-656b-482f-b6dd-c2df1f588ac7-insights-runtime-extractor-tls podName:45da1054-656b-482f-b6dd-c2df1f588ac7 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:32.530822033 +0000 UTC m=+73.831676216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/45da1054-656b-482f-b6dd-c2df1f588ac7-insights-runtime-extractor-tls") pod "insights-runtime-extractor-tqfjd" (UID: "45da1054-656b-482f-b6dd-c2df1f588ac7") : secret "insights-runtime-extractor-tls" not found
Apr 21 15:36:32.030885 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030855 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564b8\" (UniqueName: \"kubernetes.io/projected/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-kube-api-access-564b8\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.030885 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/45da1054-656b-482f-b6dd-c2df1f588ac7-data-volume\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.030990 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030919 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.030990 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnbh\" (UniqueName: \"kubernetes.io/projected/45da1054-656b-482f-b6dd-c2df1f588ac7-kube-api-access-rvnbh\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.030990 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.030990 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.030980 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-config-volume\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031173 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031223 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031185 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031223 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/45da1054-656b-482f-b6dd-c2df1f588ac7-data-volume\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.031298 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031298 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/45da1054-656b-482f-b6dd-c2df1f588ac7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.031298 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031266 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031421 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031398 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031472 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/45da1054-656b-482f-b6dd-c2df1f588ac7-crio-socket\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.031580 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-config-out\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031580 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.031580 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/45da1054-656b-482f-b6dd-c2df1f588ac7-crio-socket\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.031677 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.031633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/45da1054-656b-482f-b6dd-c2df1f588ac7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.043657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.043633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnbh\" (UniqueName: \"kubernetes.io/projected/45da1054-656b-482f-b6dd-c2df1f588ac7-kube-api-access-rvnbh\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.132504 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132504 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132757 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132757 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-config-out\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132757 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132757 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-web-config\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132782 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-564b8\" (UniqueName: \"kubernetes.io/projected/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-kube-api-access-564b8\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132836 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-config-volume\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.132935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.133221 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.132949 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.133221 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:32.133068 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 21 15:36:32.133221 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:32.133187 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-trusted-ca-bundle podName:00f6ca9e-1b0b-4ca1-9901-9c1922cab33f nodeName:}" failed. No retries permitted until 2026-04-21 15:36:32.633161412 +0000 UTC m=+73.934015622 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "00f6ca9e-1b0b-4ca1-9901-9c1922cab33f") : configmap references non-existent config key: ca-bundle.crt
Apr 21 15:36:32.133221 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:32.133212 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-main-tls podName:00f6ca9e-1b0b-4ca1-9901-9c1922cab33f nodeName:}" failed. No retries permitted until 2026-04-21 15:36:32.633202647 +0000 UTC m=+73.934056842 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "00f6ca9e-1b0b-4ca1-9901-9c1922cab33f") : secret "alertmanager-main-tls" not found
Apr 21 15:36:32.133442 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.133386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.133821 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.133796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.135439 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.135407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-config-out\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.135609 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.135587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.135755 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.135734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-web-config\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.135827 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.135758 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.135996 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.135977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-config-volume\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.136061 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.135996 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.136289 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.136271 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.136365 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.136347 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.148345 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.148327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-564b8\" (UniqueName: \"kubernetes.io/projected/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-kube-api-access-564b8\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.535486 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.535452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/45da1054-656b-482f-b6dd-c2df1f588ac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.537892 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.537863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/45da1054-656b-482f-b6dd-c2df1f588ac7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tqfjd\" (UID: \"45da1054-656b-482f-b6dd-c2df1f588ac7\") " pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.581541 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.581507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rfzdj" event={"ID":"ce65871c-6257-4a8d-9c07-cc6a8eab239e","Type":"ContainerStarted","Data":"7a71426ea78866c44f432e815cbcc966b0fe8c8c08a4ef8a418928be5be631b6"}
Apr 21 15:36:32.635828 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.635801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.635941 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.635849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.636580 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.636558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.638372 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.638351 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/00f6ca9e-1b0b-4ca1-9901-9c1922cab33f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:32.742298 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.742266 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tqfjd"
Apr 21 15:36:32.858449 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.858419 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tqfjd"]
Apr 21 15:36:32.861562 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:32.861536 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45da1054_656b_482f_b6dd_c2df1f588ac7.slice/crio-80b7475025bf9f1f1fe6eafba9b60b5d994bf02617d980086c191562cbc3f21d WatchSource:0}: Error finding container 80b7475025bf9f1f1fe6eafba9b60b5d994bf02617d980086c191562cbc3f21d: Status 404 returned error can't find the container with id 80b7475025bf9f1f1fe6eafba9b60b5d994bf02617d980086c191562cbc3f21d
Apr 21 15:36:32.913342 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:32.913320 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 15:36:33.063365 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.063282 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 15:36:33.067332 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:33.067301 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00f6ca9e_1b0b_4ca1_9901_9c1922cab33f.slice/crio-4cf93084843088b520a16d1c3e1696a5ee9f3962c84ef64d220a094f6a8977c0 WatchSource:0}: Error finding container 4cf93084843088b520a16d1c3e1696a5ee9f3962c84ef64d220a094f6a8977c0: Status 404 returned error can't find the container with id 4cf93084843088b520a16d1c3e1696a5ee9f3962c84ef64d220a094f6a8977c0
Apr 21 15:36:33.585902 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.585857 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tqfjd" event={"ID":"45da1054-656b-482f-b6dd-c2df1f588ac7","Type":"ContainerStarted","Data":"c297a34cd186fbedbc9ef8de68b1e782383778798e3316ae2ae48fb5ba33a988"}
Apr 21 15:36:33.585902 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.585904 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tqfjd" event={"ID":"45da1054-656b-482f-b6dd-c2df1f588ac7","Type":"ContainerStarted","Data":"78682cc0c42ea19dcb93bd565f0efc2ea50f09b8d4d25a40b2cd34816012072b"}
Apr 21 15:36:33.586165 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.585916 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tqfjd" event={"ID":"45da1054-656b-482f-b6dd-c2df1f588ac7","Type":"ContainerStarted","Data":"80b7475025bf9f1f1fe6eafba9b60b5d994bf02617d980086c191562cbc3f21d"}
Apr 21 15:36:33.587454 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.587424 2570 generic.go:358] "Generic (PLEG): container finished" podID="ce65871c-6257-4a8d-9c07-cc6a8eab239e" containerID="d5534e67e814aa99deb2ffea9654d0c9fc77c70f1066069359f2a1f1fc7f7b33" exitCode=0
Apr 21 15:36:33.587569 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.587528 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rfzdj" event={"ID":"ce65871c-6257-4a8d-9c07-cc6a8eab239e","Type":"ContainerDied","Data":"d5534e67e814aa99deb2ffea9654d0c9fc77c70f1066069359f2a1f1fc7f7b33"}
Apr 21 15:36:33.588929 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:33.588904 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"4cf93084843088b520a16d1c3e1696a5ee9f3962c84ef64d220a094f6a8977c0"}
Apr 21 15:36:34.594166 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:34.594108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rfzdj" event={"ID":"ce65871c-6257-4a8d-9c07-cc6a8eab239e","Type":"ContainerStarted","Data":"02e1ff0da643c588285a55532b6a5247c5c37feb2095f42ff0e439bb43beab58"}
Apr 21 15:36:34.594166 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:34.594171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rfzdj" event={"ID":"ce65871c-6257-4a8d-9c07-cc6a8eab239e","Type":"ContainerStarted","Data":"23a55177bc4f4ab007fb01a3bfa24499411a9ac4413daea8ccfcd1625e09020c"}
Apr 21 15:36:34.595822 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:34.595790 2570 generic.go:358] "Generic (PLEG): container finished" podID="00f6ca9e-1b0b-4ca1-9901-9c1922cab33f" containerID="b5209d69ba989ab891a87835d0ccb109f84c34d05018bd538267f96bf0e18c63" exitCode=0
Apr 21 15:36:34.595935 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:34.595849 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerDied","Data":"b5209d69ba989ab891a87835d0ccb109f84c34d05018bd538267f96bf0e18c63"}
Apr 21 15:36:34.655332 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:34.655273 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rfzdj" podStartSLOduration=3.958946478 podStartE2EDuration="4.655254398s" podCreationTimestamp="2026-04-21 15:36:30 +0000 UTC" firstStartedPulling="2026-04-21 15:36:31.926576805 +0000 UTC m=+73.227430998" lastFinishedPulling="2026-04-21 15:36:32.622884722 +0000 UTC m=+73.923738918" observedRunningTime="2026-04-21 15:36:34.65493137 +0000 UTC m=+75.955785573" watchObservedRunningTime="2026-04-21 15:36:34.655254398 +0000 UTC m=+75.956108601"
Apr 21 15:36:35.606292 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:35.606143 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tqfjd" event={"ID":"45da1054-656b-482f-b6dd-c2df1f588ac7","Type":"ContainerStarted","Data":"768d8ea179fe47ada420357971607cdb993ff9d63b480b9c63f6a3b26e81de82"}
Apr 21 15:36:35.629224 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:35.629168 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tqfjd" podStartSLOduration=2.408745815 podStartE2EDuration="4.629149188s" podCreationTimestamp="2026-04-21 15:36:31 +0000 UTC" firstStartedPulling="2026-04-21 15:36:32.911965008 +0000 UTC m=+74.212819188" lastFinishedPulling="2026-04-21 15:36:35.132368377 +0000 UTC m=+76.433222561" observedRunningTime="2026-04-21 15:36:35.62810665 +0000 UTC m=+76.928960854" watchObservedRunningTime="2026-04-21 15:36:35.629149188 +0000 UTC m=+76.930003389"
Apr 21 15:36:36.204201 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.204176 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-db98c9989-mrq85"]
Apr 21 15:36:36.207279 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.207259 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.211069 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.211046 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 15:36:36.211214 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.211164 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 15:36:36.211214 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.211176 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 15:36:36.211214 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.211205 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 15:36:36.211563 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.211544 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 15:36:36.217416 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.217397 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5vsh4\""
Apr 21 15:36:36.218023 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.218005 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 15:36:36.228887 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.228855 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-db98c9989-mrq85"]
Apr 21 15:36:36.266079 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266058 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-telemeter-client-tls\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266213 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-telemeter-trusted-ca-bundle\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266213 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266112 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxr8b\" (UniqueName: \"kubernetes.io/projected/6139496d-4b2d-4915-972b-f5ce78080077-kube-api-access-cxr8b\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266213 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-serving-certs-ca-bundle\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-federate-client-tls\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266265 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-metrics-client-ca\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-secret-telemeter-client\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.266331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.266323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85"
Apr 21 15:36:36.366674 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.366651 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-telemeter-client-tls\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " 
pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.366805 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.366682 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-telemeter-trusted-ca-bundle\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.366805 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.366703 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxr8b\" (UniqueName: \"kubernetes.io/projected/6139496d-4b2d-4915-972b-f5ce78080077-kube-api-access-cxr8b\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.366805 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.366722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-serving-certs-ca-bundle\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.366805 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.366743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-federate-client-tls\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.367082 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.367030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-metrics-client-ca\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.367082 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.367074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-secret-telemeter-client\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.367240 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.367106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.367871 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.367509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-telemeter-trusted-ca-bundle\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.367871 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.367784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-serving-certs-ca-bundle\") pod \"telemeter-client-db98c9989-mrq85\" (UID: 
\"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.368353 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.368328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6139496d-4b2d-4915-972b-f5ce78080077-metrics-client-ca\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.369481 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.369460 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-federate-client-tls\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.369606 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.369519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-telemeter-client-tls\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.369801 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.369780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.370544 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.370526 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6139496d-4b2d-4915-972b-f5ce78080077-secret-telemeter-client\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.378476 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.378456 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxr8b\" (UniqueName: \"kubernetes.io/projected/6139496d-4b2d-4915-972b-f5ce78080077-kube-api-access-cxr8b\") pod \"telemeter-client-db98c9989-mrq85\" (UID: \"6139496d-4b2d-4915-972b-f5ce78080077\") " pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.517988 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.517826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" Apr 21 15:36:36.611796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.611757 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"1313dfb9e7c4e45b99589b50dbc7afde9fe111501bd58eeafe193900e5884d85"} Apr 21 15:36:36.611796 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.611802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"a264df32cca01865bbddadd7a89b06e769e382642d9e0c209aecf5a6a4d71dad"} Apr 21 15:36:36.612346 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.611816 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"ad45103475dc9ea4a9611268745a876392c9d6bb07cf47d2f278349f3fbbfe39"} Apr 21 15:36:36.612346 ip-10-0-133-237 
kubenswrapper[2570]: I0421 15:36:36.611827 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"9fe192c73463a97bb69a9af6d450bd53d086385d8fd421c840b5b5d9aa76de73"} Apr 21 15:36:36.612346 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.611838 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"0e361ab0015471a6ea334305d74b0ceaf654e7eac61342b0c5926a6414d4a346"} Apr 21 15:36:36.640315 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:36.640284 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-db98c9989-mrq85"] Apr 21 15:36:36.643684 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:36.643656 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6139496d_4b2d_4915_972b_f5ce78080077.slice/crio-d2734be9669f4b335414c66ef42e0bd27d764d96fe15378d8c41ea2663509984 WatchSource:0}: Error finding container d2734be9669f4b335414c66ef42e0bd27d764d96fe15378d8c41ea2663509984: Status 404 returned error can't find the container with id d2734be9669f4b335414c66ef42e0bd27d764d96fe15378d8c41ea2663509984 Apr 21 15:36:37.259820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.259781 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:36:37.263462 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.263439 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.267511 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.267374 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 15:36:37.267511 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.267403 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 15:36:37.267511 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.267412 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 15:36:37.267941 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.267923 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 15:36:37.268015 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.267999 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 15:36:37.268093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.268073 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-79qbc\"" Apr 21 15:36:37.268675 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.268654 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 15:36:37.268993 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.268967 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 15:36:37.269079 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.269030 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cgffa3aa5be4b\"" Apr 21 15:36:37.269977 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.269307 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 15:36:37.269977 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.269365 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 15:36:37.269977 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.269608 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 15:36:37.269977 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.269690 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 15:36:37.270373 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.270354 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 15:36:37.274044 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.274022 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 15:36:37.282802 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.282778 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:36:37.377270 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377182 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn9wc\" (UniqueName: \"kubernetes.io/projected/73686fb8-2565-4898-8b87-a933f73b46a0-kube-api-access-jn9wc\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
15:36:37.377270 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377489 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377305 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377489 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377489 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73686fb8-2565-4898-8b87-a933f73b46a0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377489 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377449 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377489 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377466 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377489 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377485 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377501 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73686fb8-2565-4898-8b87-a933f73b46a0-config-out\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 
15:36:37.377569 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-web-config\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377593 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377647 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-config\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377673 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.377820 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.377807 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478566 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478526 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478566 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn9wc\" (UniqueName: \"kubernetes.io/projected/73686fb8-2565-4898-8b87-a933f73b46a0-kube-api-access-jn9wc\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478688 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: 
I0421 15:36:37.478728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73686fb8-2565-4898-8b87-a933f73b46a0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478749 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.478819 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478803 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73686fb8-2565-4898-8b87-a933f73b46a0-config-out\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478875 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-web-config\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-config\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 
ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.478980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479659 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.479397 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.479727 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.479668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.482412 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.482387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/73686fb8-2565-4898-8b87-a933f73b46a0-config-out\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.482540 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.482422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.482643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/73686fb8-2565-4898-8b87-a933f73b46a0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.482820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.482941 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.483025 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.483064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.483157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.483172 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483512 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.483471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-config\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.483880 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.483662 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/73686fb8-2565-4898-8b87-a933f73b46a0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.484282 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.484260 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.484653 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.484632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-web-config\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.485204 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.485180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.485285 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.485237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/73686fb8-2565-4898-8b87-a933f73b46a0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.487031 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.487013 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn9wc\" (UniqueName: 
\"kubernetes.io/projected/73686fb8-2565-4898-8b87-a933f73b46a0-kube-api-access-jn9wc\") pod \"prometheus-k8s-0\" (UID: \"73686fb8-2565-4898-8b87-a933f73b46a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.579444 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.579405 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:36:37.618829 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.618776 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"00f6ca9e-1b0b-4ca1-9901-9c1922cab33f","Type":"ContainerStarted","Data":"a9896bcfd12c440450f8ff8e7b18efc924075eb43f7ba7304b4ad898052a0a61"} Apr 21 15:36:37.621628 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.621596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" event={"ID":"6139496d-4b2d-4915-972b-f5ce78080077","Type":"ContainerStarted","Data":"d2734be9669f4b335414c66ef42e0bd27d764d96fe15378d8c41ea2663509984"} Apr 21 15:36:37.651371 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.651315 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.608571465 podStartE2EDuration="6.651296361s" podCreationTimestamp="2026-04-21 15:36:31 +0000 UTC" firstStartedPulling="2026-04-21 15:36:33.069043897 +0000 UTC m=+74.369898086" lastFinishedPulling="2026-04-21 15:36:37.111768799 +0000 UTC m=+78.412622982" observedRunningTime="2026-04-21 15:36:37.650237286 +0000 UTC m=+78.951091488" watchObservedRunningTime="2026-04-21 15:36:37.651296361 +0000 UTC m=+78.952150565" Apr 21 15:36:37.732890 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:37.732859 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 15:36:37.736580 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:36:37.736552 
2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73686fb8_2565_4898_8b87_a933f73b46a0.slice/crio-552db067af78a331e3a22e90f31b2798564c54a7fde777cef057544e692268f2 WatchSource:0}: Error finding container 552db067af78a331e3a22e90f31b2798564c54a7fde777cef057544e692268f2: Status 404 returned error can't find the container with id 552db067af78a331e3a22e90f31b2798564c54a7fde777cef057544e692268f2 Apr 21 15:36:38.626430 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:38.626399 2570 generic.go:358] "Generic (PLEG): container finished" podID="73686fb8-2565-4898-8b87-a933f73b46a0" containerID="17e02c6516f00e026de2a5a85d385f43fe6df7f65e37ed07fd1f8f51276e783b" exitCode=0 Apr 21 15:36:38.626807 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:38.626480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerDied","Data":"17e02c6516f00e026de2a5a85d385f43fe6df7f65e37ed07fd1f8f51276e783b"} Apr 21 15:36:38.626807 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:38.626534 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"552db067af78a331e3a22e90f31b2798564c54a7fde777cef057544e692268f2"} Apr 21 15:36:38.628180 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:38.628118 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" event={"ID":"6139496d-4b2d-4915-972b-f5ce78080077","Type":"ContainerStarted","Data":"012ce3ecab8fe2735e576a6525f7f10020f1bdef08e2f407d2235030e5e5a77d"} Apr 21 15:36:39.632942 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:39.632901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" 
event={"ID":"6139496d-4b2d-4915-972b-f5ce78080077","Type":"ContainerStarted","Data":"6916dc45ee250ea958861e1849a4ddd77a79a7280d94cc60308c342191f4df2f"} Apr 21 15:36:39.632942 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:39.632947 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" event={"ID":"6139496d-4b2d-4915-972b-f5ce78080077","Type":"ContainerStarted","Data":"5d567fe066c7a15fecfea52cd8f390e8ebd0d38e893cbdce4b0fe1381d1c53e6"} Apr 21 15:36:39.658078 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:39.658020 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-db98c9989-mrq85" podStartSLOduration=1.757297214 podStartE2EDuration="3.658002016s" podCreationTimestamp="2026-04-21 15:36:36 +0000 UTC" firstStartedPulling="2026-04-21 15:36:36.645441095 +0000 UTC m=+77.946295275" lastFinishedPulling="2026-04-21 15:36:38.546145896 +0000 UTC m=+79.847000077" observedRunningTime="2026-04-21 15:36:39.657040164 +0000 UTC m=+80.957894367" watchObservedRunningTime="2026-04-21 15:36:39.658002016 +0000 UTC m=+80.958856230" Apr 21 15:36:41.642488 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:41.642454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"6f4eec454a5fe57948747ed5292ccd09f54280d98f7ac71a1e10f69709e428dc"} Apr 21 15:36:41.642891 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:41.642495 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"980c6a8fb0088c9d4df4f44f357b2f50dcb7072e25b37cdc892dbb6475ec3fd9"} Apr 21 15:36:43.653775 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:43.653743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"ca0426593b60807e6b7eefd9694b0507b4fecc4aff7d5dea8ec407fc6a4e9995"} Apr 21 15:36:43.653775 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:43.653780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"d964777605c84fa54f52e6343d174d36d79e1a9843185df8c37bceb0f0d12f59"} Apr 21 15:36:43.654224 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:43.653789 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"cb16b14a0738aa14bd9e397517a4073b41bf17baed85f8db3f5e5c94f4a7e44f"} Apr 21 15:36:43.654224 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:43.653797 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"73686fb8-2565-4898-8b87-a933f73b46a0","Type":"ContainerStarted","Data":"fa86f27e2dbc69f28ecb32b50c0e35cbf42ac051d8ff9773ba0766198b7d6987"} Apr 21 15:36:43.692564 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:43.692507 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.404937365 podStartE2EDuration="6.692493435s" podCreationTimestamp="2026-04-21 15:36:37 +0000 UTC" firstStartedPulling="2026-04-21 15:36:38.627825656 +0000 UTC m=+79.928679836" lastFinishedPulling="2026-04-21 15:36:42.915381726 +0000 UTC m=+84.216235906" observedRunningTime="2026-04-21 15:36:43.691889316 +0000 UTC m=+84.992743519" watchObservedRunningTime="2026-04-21 15:36:43.692493435 +0000 UTC m=+84.993347636" Apr 21 15:36:44.635369 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.635325 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" podUID="70851abd-b4ff-4289-82b5-49d3c83c3007" containerName="registry" containerID="cri-o://af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7" gracePeriod=30 Apr 21 15:36:44.870683 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.870661 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:44.948835 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948743 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-installation-pull-secrets\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.948835 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948780 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9vfs\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-kube-api-access-z9vfs\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.948835 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948807 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/70851abd-b4ff-4289-82b5-49d3c83c3007-ca-trust-extracted\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.949093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948843 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-image-registry-private-configuration\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: 
\"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.949093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948891 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-bound-sa-token\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.949093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948935 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-certificates\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.949093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.948967 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.949093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.949009 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-trusted-ca\") pod \"70851abd-b4ff-4289-82b5-49d3c83c3007\" (UID: \"70851abd-b4ff-4289-82b5-49d3c83c3007\") " Apr 21 15:36:44.949556 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.949522 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:36:44.949716 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.949530 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:36:44.951635 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.951604 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:36:44.951781 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.951651 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:36:44.951781 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.951655 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:36:44.951781 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.951731 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:36:44.951900 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.951799 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-kube-api-access-z9vfs" (OuterVolumeSpecName: "kube-api-access-z9vfs") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "kube-api-access-z9vfs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:36:44.957916 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:44.957887 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70851abd-b4ff-4289-82b5-49d3c83c3007-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "70851abd-b4ff-4289-82b5-49d3c83c3007" (UID: "70851abd-b4ff-4289-82b5-49d3c83c3007"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:36:45.050341 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050303 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-installation-pull-secrets\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050341 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050335 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z9vfs\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-kube-api-access-z9vfs\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050341 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050345 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/70851abd-b4ff-4289-82b5-49d3c83c3007-ca-trust-extracted\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050547 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050355 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/70851abd-b4ff-4289-82b5-49d3c83c3007-image-registry-private-configuration\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050547 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050365 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-bound-sa-token\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050547 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050375 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-certificates\") on node 
\"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050547 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050383 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/70851abd-b4ff-4289-82b5-49d3c83c3007-registry-tls\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.050547 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.050392 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/70851abd-b4ff-4289-82b5-49d3c83c3007-trusted-ca\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\"" Apr 21 15:36:45.660652 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.660561 2570 generic.go:358] "Generic (PLEG): container finished" podID="70851abd-b4ff-4289-82b5-49d3c83c3007" containerID="af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7" exitCode=0 Apr 21 15:36:45.660652 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.660622 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" event={"ID":"70851abd-b4ff-4289-82b5-49d3c83c3007","Type":"ContainerDied","Data":"af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7"} Apr 21 15:36:45.660652 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.660649 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" event={"ID":"70851abd-b4ff-4289-82b5-49d3c83c3007","Type":"ContainerDied","Data":"f658684bf45e7e7fc4c96c85bb8dc3a9e89686d55a35f26f6cadd1a71427caca"} Apr 21 15:36:45.660884 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.660653 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66ffb968db-h6xsp" Apr 21 15:36:45.660884 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.660664 2570 scope.go:117] "RemoveContainer" containerID="af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7" Apr 21 15:36:45.668862 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.668847 2570 scope.go:117] "RemoveContainer" containerID="af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7" Apr 21 15:36:45.669126 ip-10-0-133-237 kubenswrapper[2570]: E0421 15:36:45.669105 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7\": container with ID starting with af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7 not found: ID does not exist" containerID="af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7" Apr 21 15:36:45.669278 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.669147 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7"} err="failed to get container status \"af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7\": rpc error: code = NotFound desc = could not find container \"af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7\": container with ID starting with af6fc97823587a656b73f99f228697431285e2f2cfc748990ad3d53d84627ca7 not found: ID does not exist" Apr 21 15:36:45.686407 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.686378 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66ffb968db-h6xsp"] Apr 21 15:36:45.692920 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:45.692900 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66ffb968db-h6xsp"] Apr 21 
15:36:47.336201 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:47.336167 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70851abd-b4ff-4289-82b5-49d3c83c3007" path="/var/lib/kubelet/pods/70851abd-b4ff-4289-82b5-49d3c83c3007/volumes" Apr 21 15:36:47.580591 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:36:47.580537 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:37:00.578563 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:37:00.578532 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4spst" Apr 21 15:37:37.579843 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:37:37.579801 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:37:37.598619 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:37:37.598593 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:37:37.813198 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:37:37.813160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 15:40:19.193852 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:40:19.193819 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 15:42:41.335942 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.335865 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-pgmql"] Apr 21 15:42:41.336398 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.336184 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70851abd-b4ff-4289-82b5-49d3c83c3007" containerName="registry" Apr 21 15:42:41.336398 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.336198 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="70851abd-b4ff-4289-82b5-49d3c83c3007" 
containerName="registry" Apr 21 15:42:41.336398 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.336243 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="70851abd-b4ff-4289-82b5-49d3c83c3007" containerName="registry" Apr 21 15:42:41.338944 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.338928 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pgmql" Apr 21 15:42:41.341174 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.341150 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 15:42:41.341331 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.341243 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 21 15:42:41.341652 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.341635 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-7hhth\"" Apr 21 15:42:41.341724 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.341664 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 15:42:41.345062 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.345040 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pgmql"] Apr 21 15:42:41.418658 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.418623 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4j4k\" (UniqueName: \"kubernetes.io/projected/1cbadb8d-7973-4e35-bf67-ae3bf857bc4c-kube-api-access-q4j4k\") pod \"s3-init-pgmql\" (UID: \"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c\") " pod="kserve/s3-init-pgmql" Apr 21 15:42:41.519164 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.519122 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4j4k\" (UniqueName: 
\"kubernetes.io/projected/1cbadb8d-7973-4e35-bf67-ae3bf857bc4c-kube-api-access-q4j4k\") pod \"s3-init-pgmql\" (UID: \"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c\") " pod="kserve/s3-init-pgmql"
Apr 21 15:42:41.534739 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.534711 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4j4k\" (UniqueName: \"kubernetes.io/projected/1cbadb8d-7973-4e35-bf67-ae3bf857bc4c-kube-api-access-q4j4k\") pod \"s3-init-pgmql\" (UID: \"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c\") " pod="kserve/s3-init-pgmql"
Apr 21 15:42:41.648236 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.648169 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pgmql"
Apr 21 15:42:41.760084 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.760052 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-pgmql"]
Apr 21 15:42:41.762861 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:42:41.762834 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cbadb8d_7973_4e35_bf67_ae3bf857bc4c.slice/crio-bf45fe6288d94f21df944652da004dfc815823e9c2b128ecda74881764acc202 WatchSource:0}: Error finding container bf45fe6288d94f21df944652da004dfc815823e9c2b128ecda74881764acc202: Status 404 returned error can't find the container with id bf45fe6288d94f21df944652da004dfc815823e9c2b128ecda74881764acc202
Apr 21 15:42:41.764526 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:41.764508 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:42:42.591753 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:42.591715 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pgmql" event={"ID":"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c","Type":"ContainerStarted","Data":"bf45fe6288d94f21df944652da004dfc815823e9c2b128ecda74881764acc202"}
Apr 21 15:42:46.603494 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:46.603453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pgmql" event={"ID":"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c","Type":"ContainerStarted","Data":"c5f94589d1e4e379f5f2d80170d68d824d71f89525b0bb325166c7bbd13bb3fe"}
Apr 21 15:42:46.620082 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:46.620028 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-pgmql" podStartSLOduration=1.144912609 podStartE2EDuration="5.620011492s" podCreationTimestamp="2026-04-21 15:42:41 +0000 UTC" firstStartedPulling="2026-04-21 15:42:41.76462767 +0000 UTC m=+443.065481851" lastFinishedPulling="2026-04-21 15:42:46.239726549 +0000 UTC m=+447.540580734" observedRunningTime="2026-04-21 15:42:46.619870457 +0000 UTC m=+447.920724660" watchObservedRunningTime="2026-04-21 15:42:46.620011492 +0000 UTC m=+447.920865694"
Apr 21 15:42:49.612516 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:49.612475 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbadb8d-7973-4e35-bf67-ae3bf857bc4c" containerID="c5f94589d1e4e379f5f2d80170d68d824d71f89525b0bb325166c7bbd13bb3fe" exitCode=0
Apr 21 15:42:49.612872 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:49.612550 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pgmql" event={"ID":"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c","Type":"ContainerDied","Data":"c5f94589d1e4e379f5f2d80170d68d824d71f89525b0bb325166c7bbd13bb3fe"}
Apr 21 15:42:50.727220 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:50.727199 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pgmql"
Apr 21 15:42:50.799505 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:50.799475 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4j4k\" (UniqueName: \"kubernetes.io/projected/1cbadb8d-7973-4e35-bf67-ae3bf857bc4c-kube-api-access-q4j4k\") pod \"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c\" (UID: \"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c\") "
Apr 21 15:42:50.801490 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:50.801466 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbadb8d-7973-4e35-bf67-ae3bf857bc4c-kube-api-access-q4j4k" (OuterVolumeSpecName: "kube-api-access-q4j4k") pod "1cbadb8d-7973-4e35-bf67-ae3bf857bc4c" (UID: "1cbadb8d-7973-4e35-bf67-ae3bf857bc4c"). InnerVolumeSpecName "kube-api-access-q4j4k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:42:50.900664 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:50.900592 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4j4k\" (UniqueName: \"kubernetes.io/projected/1cbadb8d-7973-4e35-bf67-ae3bf857bc4c-kube-api-access-q4j4k\") on node \"ip-10-0-133-237.ec2.internal\" DevicePath \"\""
Apr 21 15:42:51.618428 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:51.618355 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-pgmql"
Apr 21 15:42:51.618565 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:51.618355 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-pgmql" event={"ID":"1cbadb8d-7973-4e35-bf67-ae3bf857bc4c","Type":"ContainerDied","Data":"bf45fe6288d94f21df944652da004dfc815823e9c2b128ecda74881764acc202"}
Apr 21 15:42:51.618565 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:42:51.618463 2570 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf45fe6288d94f21df944652da004dfc815823e9c2b128ecda74881764acc202"
Apr 21 15:57:27.575090 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.575047 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pvkx/must-gather-ffmkm"]
Apr 21 15:57:27.575504 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.575329 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cbadb8d-7973-4e35-bf67-ae3bf857bc4c" containerName="s3-init"
Apr 21 15:57:27.575504 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.575339 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbadb8d-7973-4e35-bf67-ae3bf857bc4c" containerName="s3-init"
Apr 21 15:57:27.575504 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.575383 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1cbadb8d-7973-4e35-bf67-ae3bf857bc4c" containerName="s3-init"
Apr 21 15:57:27.578202 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.578187 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.580268 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.580243 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7pvkx\"/\"default-dockercfg-mgs4b\""
Apr 21 15:57:27.580414 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.580322 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7pvkx\"/\"kube-root-ca.crt\""
Apr 21 15:57:27.580481 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.580415 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7pvkx\"/\"openshift-service-ca.crt\""
Apr 21 15:57:27.586937 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.586910 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/must-gather-ffmkm"]
Apr 21 15:57:27.654183 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.654152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e4de76de-9ef7-41da-adc4-4bef0cec752a-must-gather-output\") pod \"must-gather-ffmkm\" (UID: \"e4de76de-9ef7-41da-adc4-4bef0cec752a\") " pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.654335 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.654202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr9ns\" (UniqueName: \"kubernetes.io/projected/e4de76de-9ef7-41da-adc4-4bef0cec752a-kube-api-access-tr9ns\") pod \"must-gather-ffmkm\" (UID: \"e4de76de-9ef7-41da-adc4-4bef0cec752a\") " pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.754925 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.754876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e4de76de-9ef7-41da-adc4-4bef0cec752a-must-gather-output\") pod \"must-gather-ffmkm\" (UID: \"e4de76de-9ef7-41da-adc4-4bef0cec752a\") " pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.755093 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.754944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr9ns\" (UniqueName: \"kubernetes.io/projected/e4de76de-9ef7-41da-adc4-4bef0cec752a-kube-api-access-tr9ns\") pod \"must-gather-ffmkm\" (UID: \"e4de76de-9ef7-41da-adc4-4bef0cec752a\") " pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.755316 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.755295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e4de76de-9ef7-41da-adc4-4bef0cec752a-must-gather-output\") pod \"must-gather-ffmkm\" (UID: \"e4de76de-9ef7-41da-adc4-4bef0cec752a\") " pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.763325 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.763302 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr9ns\" (UniqueName: \"kubernetes.io/projected/e4de76de-9ef7-41da-adc4-4bef0cec752a-kube-api-access-tr9ns\") pod \"must-gather-ffmkm\" (UID: \"e4de76de-9ef7-41da-adc4-4bef0cec752a\") " pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.889071 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.888982 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/must-gather-ffmkm"
Apr 21 15:57:27.999278 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:27.999248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/must-gather-ffmkm"]
Apr 21 15:57:28.002013 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:57:28.001987 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4de76de_9ef7_41da_adc4_4bef0cec752a.slice/crio-922de0aaf78f7ab6e343806279b51ce7a8efe37f6706f6d68309529e3a927f29 WatchSource:0}: Error finding container 922de0aaf78f7ab6e343806279b51ce7a8efe37f6706f6d68309529e3a927f29: Status 404 returned error can't find the container with id 922de0aaf78f7ab6e343806279b51ce7a8efe37f6706f6d68309529e3a927f29
Apr 21 15:57:28.003741 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:28.003723 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:57:28.926916 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:28.926889 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/must-gather-ffmkm" event={"ID":"e4de76de-9ef7-41da-adc4-4bef0cec752a","Type":"ContainerStarted","Data":"922de0aaf78f7ab6e343806279b51ce7a8efe37f6706f6d68309529e3a927f29"}
Apr 21 15:57:29.931719 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:29.931676 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/must-gather-ffmkm" event={"ID":"e4de76de-9ef7-41da-adc4-4bef0cec752a","Type":"ContainerStarted","Data":"34da3dfa162a2b7c6cc1d55d11576f93eef80b604bb98a402462bde355e99a9d"}
Apr 21 15:57:29.931719 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:29.931714 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/must-gather-ffmkm" event={"ID":"e4de76de-9ef7-41da-adc4-4bef0cec752a","Type":"ContainerStarted","Data":"584dca8871c48ace9155687e947f9f3d95701a55772c358dd10071c871f0837f"}
Apr 21 15:57:30.599592 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:30.599559 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dmvhg_90b42886-9124-47a7-8a37-518ea2f64986/global-pull-secret-syncer/0.log"
Apr 21 15:57:30.695338 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:30.695309 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9222j_50b4859e-da58-4584-a53e-a4daaccafc4c/konnectivity-agent/0.log"
Apr 21 15:57:30.788438 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:30.788410 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-237.ec2.internal_c9f0db16f43c11bf91bf71ea6d873f37/haproxy/0.log"
Apr 21 15:57:33.736173 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.736115 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/alertmanager/0.log"
Apr 21 15:57:33.763085 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.763042 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/config-reloader/0.log"
Apr 21 15:57:33.789116 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.789080 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/kube-rbac-proxy-web/0.log"
Apr 21 15:57:33.815285 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.815249 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/kube-rbac-proxy/0.log"
Apr 21 15:57:33.840313 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.840284 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/kube-rbac-proxy-metric/0.log"
Apr 21 15:57:33.866346 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.866276 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/prom-label-proxy/0.log"
Apr 21 15:57:33.892237 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:33.892205 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_00f6ca9e-1b0b-4ca1-9901-9c1922cab33f/init-config-reloader/0.log"
Apr 21 15:57:34.362237 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:34.362128 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rfzdj_ce65871c-6257-4a8d-9c07-cc6a8eab239e/node-exporter/0.log"
Apr 21 15:57:34.412526 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:34.412492 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rfzdj_ce65871c-6257-4a8d-9c07-cc6a8eab239e/kube-rbac-proxy/0.log"
Apr 21 15:57:34.467546 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:34.467519 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rfzdj_ce65871c-6257-4a8d-9c07-cc6a8eab239e/init-textfile/0.log"
Apr 21 15:57:34.970528 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:34.970493 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/prometheus/0.log"
Apr 21 15:57:35.040019 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.039981 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/config-reloader/0.log"
Apr 21 15:57:35.105218 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.105184 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/thanos-sidecar/0.log"
Apr 21 15:57:35.167304 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.167269 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/kube-rbac-proxy-web/0.log"
Apr 21 15:57:35.219432 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.219403 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/kube-rbac-proxy/0.log"
Apr 21 15:57:35.319124 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.319095 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/kube-rbac-proxy-thanos/0.log"
Apr 21 15:57:35.389735 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.389688 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_73686fb8-2565-4898-8b87-a933f73b46a0/init-config-reloader/0.log"
Apr 21 15:57:35.871083 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.870943 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-db98c9989-mrq85_6139496d-4b2d-4915-972b-f5ce78080077/telemeter-client/0.log"
Apr 21 15:57:35.949357 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.949323 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-db98c9989-mrq85_6139496d-4b2d-4915-972b-f5ce78080077/reload/0.log"
Apr 21 15:57:35.997583 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:35.997554 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-db98c9989-mrq85_6139496d-4b2d-4915-972b-f5ce78080077/kube-rbac-proxy/0.log"
Apr 21 15:57:36.969557 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:36.969439 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pvkx/must-gather-ffmkm" podStartSLOduration=9.087123996 podStartE2EDuration="9.969418946s" podCreationTimestamp="2026-04-21 15:57:27 +0000 UTC" firstStartedPulling="2026-04-21 15:57:28.003870246 +0000 UTC m=+1329.304724426" lastFinishedPulling="2026-04-21 15:57:28.886165193 +0000 UTC m=+1330.187019376" observedRunningTime="2026-04-21 15:57:29.955428521 +0000 UTC m=+1331.256282720" watchObservedRunningTime="2026-04-21 15:57:36.969418946 +0000 UTC m=+1338.270273148"
Apr 21 15:57:36.970762 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:36.970736 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"]
Apr 21 15:57:36.974068 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:36.974050 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:36.984158 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:36.984119 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"]
Apr 21 15:57:37.032552 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.032516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-lib-modules\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.032552 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.032555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-sys\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.033004 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.032578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-proc\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.033004 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.032663 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fw7\" (UniqueName: \"kubernetes.io/projected/63ff2728-62b1-4f76-85c5-fe35da173731-kube-api-access-n8fw7\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.033004 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.032744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-podres\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133582 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fw7\" (UniqueName: \"kubernetes.io/projected/63ff2728-62b1-4f76-85c5-fe35da173731-kube-api-access-n8fw7\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133760 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133605 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-podres\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133760 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-lib-modules\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133760 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133701 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-sys\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133760 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133727 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-proc\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133945 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-proc\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133945 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-lib-modules\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133945 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-podres\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.133945 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.133833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63ff2728-62b1-4f76-85c5-fe35da173731-sys\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.143241 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.143210 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fw7\" (UniqueName: \"kubernetes.io/projected/63ff2728-62b1-4f76-85c5-fe35da173731-kube-api-access-n8fw7\") pod \"perf-node-gather-daemonset-rdmsd\" (UID: \"63ff2728-62b1-4f76-85c5-fe35da173731\") " pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.283657 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.283626 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.407576 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.407552 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"]
Apr 21 15:57:37.409958 ip-10-0-133-237 kubenswrapper[2570]: W0421 15:57:37.409915 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod63ff2728_62b1_4f76_85c5_fe35da173731.slice/crio-3c7f0006d2b6800b3632f18b10847f333c837df789b24b5bd1992d40c884b452 WatchSource:0}: Error finding container 3c7f0006d2b6800b3632f18b10847f333c837df789b24b5bd1992d40c884b452: Status 404 returned error can't find the container with id 3c7f0006d2b6800b3632f18b10847f333c837df789b24b5bd1992d40c884b452
Apr 21 15:57:37.962366 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.962286 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd" event={"ID":"63ff2728-62b1-4f76-85c5-fe35da173731","Type":"ContainerStarted","Data":"61bcb40a3fea354e047a7acc34b5b99f49c8978ba37df8bf654653e066935521"}
Apr 21 15:57:37.962366 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.962324 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd" event={"ID":"63ff2728-62b1-4f76-85c5-fe35da173731","Type":"ContainerStarted","Data":"3c7f0006d2b6800b3632f18b10847f333c837df789b24b5bd1992d40c884b452"}
Apr 21 15:57:37.962629 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.962410 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:37.987537 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:37.987495 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd" podStartSLOduration=1.987482327 podStartE2EDuration="1.987482327s" podCreationTimestamp="2026-04-21 15:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:57:37.987275692 +0000 UTC m=+1339.288129897" watchObservedRunningTime="2026-04-21 15:57:37.987482327 +0000 UTC m=+1339.288336528"
Apr 21 15:57:39.654699 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:39.654666 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4xm8n_63d08935-bd63-4c7f-83c9-df40083b472a/dns/0.log"
Apr 21 15:57:39.693355 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:39.693328 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4xm8n_63d08935-bd63-4c7f-83c9-df40083b472a/kube-rbac-proxy/0.log"
Apr 21 15:57:39.946588 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:39.946518 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6kcrm_d5081a65-e77f-4228-83e9-044b28aa3b8b/dns-node-resolver/0.log"
Apr 21 15:57:40.729292 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:40.729269 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qsgrz_08d2130c-7332-485b-95f8-0728da25787a/node-ca/0.log"
Apr 21 15:57:42.037995 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:42.037966 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9lpqk_b5a5676f-25d5-4f87-ad65-41d268c5e9f4/serve-healthcheck-canary/0.log"
Apr 21 15:57:42.803615 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:42.803575 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tqfjd_45da1054-656b-482f-b6dd-c2df1f588ac7/kube-rbac-proxy/0.log"
Apr 21 15:57:42.829911 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:42.829888 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tqfjd_45da1054-656b-482f-b6dd-c2df1f588ac7/exporter/0.log"
Apr 21 15:57:42.852291 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:42.852263 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tqfjd_45da1054-656b-482f-b6dd-c2df1f588ac7/extractor/0.log"
Apr 21 15:57:43.974215 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:43.974186 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7pvkx/perf-node-gather-daemonset-rdmsd"
Apr 21 15:57:45.200954 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:45.200918 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-pgmql_1cbadb8d-7973-4e35-bf67-ae3bf857bc4c/s3-init/0.log"
Apr 21 15:57:49.987878 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:49.987849 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-smkv2_7b0361fd-9757-471d-becd-b04ebf9ab715/migrator/0.log"
Apr 21 15:57:50.026971 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:50.026936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-smkv2_7b0361fd-9757-471d-becd-b04ebf9ab715/graceful-termination/0.log"
Apr 21 15:57:51.520994 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.520967 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/kube-multus-additional-cni-plugins/0.log"
Apr 21 15:57:51.547642 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.547613 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/egress-router-binary-copy/0.log"
Apr 21 15:57:51.576074 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.576048 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/cni-plugins/0.log"
Apr 21 15:57:51.609998 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.609975 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/bond-cni-plugin/0.log"
Apr 21 15:57:51.706634 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.706601 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/routeoverride-cni/0.log"
Apr 21 15:57:51.733999 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.733974 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/whereabouts-cni-bincopy/0.log"
Apr 21 15:57:51.763855 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:51.763835 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kltss_abc50d87-ddda-484f-bcda-07b2af6fbf70/whereabouts-cni/0.log"
Apr 21 15:57:52.263610 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:52.263580 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n4dx6_81a2659f-602c-4c12-bd0d-20488c10a56f/kube-multus/0.log"
Apr 21 15:57:52.431028 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:52.430998 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x2rv7_54264bba-76e1-44c8-8581-4f2271e68bd7/network-metrics-daemon/0.log"
Apr 21 15:57:52.457562 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:52.457536 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x2rv7_54264bba-76e1-44c8-8581-4f2271e68bd7/kube-rbac-proxy/0.log"
Apr 21 15:57:53.399124 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.399095 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/ovn-controller/0.log"
Apr 21 15:57:53.455639 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.455609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/ovn-acl-logging/0.log"
Apr 21 15:57:53.504950 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.504890 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/kube-rbac-proxy-node/0.log"
Apr 21 15:57:53.543835 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.543794 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 15:57:53.588629 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.588590 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/northd/0.log"
Apr 21 15:57:53.627555 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.627515 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/nbdb/0.log"
Apr 21 15:57:53.652597 ip-10-0-133-237 kubenswrapper[2570]: I0421 15:57:53.652575 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8ndqn_7aae9a17-ae2e-4328-91d9-7fb4b43f79e2/sbdb/0.log"