Apr 16 13:56:45.658780 ip-10-0-136-109 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:45.658792 ip-10-0-136-109 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:45.658799 ip-10-0-136-109 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:45.659020 ip-10-0-136-109 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:57.013069 ip-10-0-136-109 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:57.013090 ip-10-0-136-109 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot cd9b8b10ee2c4ee990e8c33091a3816b --
Apr 16 13:59:20.657636 ip-10-0-136-109 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:21.110160 ip-10-0-136-109 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:21.110160 ip-10-0-136-109 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:21.110160 ip-10-0-136-109 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:21.110160 ip-10-0-136-109 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:21.110160 ip-10-0-136-109 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:21.112397 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.112287 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:21.115386 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115370 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:21.115386 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115387 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115391 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115394 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115398 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115401 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115404 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115406 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115409 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115412 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115415 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115418 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115421 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115425 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115427 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115431 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115433 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115436 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115444 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115447 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:21.115455 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115449 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115454 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115457 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115461 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115464 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115468 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115471 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115473 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115476 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115479 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115481 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115484 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115487 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115490 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115493 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115495 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115499 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115501 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115504 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115507 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:21.115922 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115509 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115511 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115514 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115517 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115519 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115523 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115525 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115528 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115530 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115532 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115535 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115538 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115540 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115543 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115546 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115549 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115551 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115554 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115557 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115559 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:21.116450 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115562 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115564 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115567 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115569 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115571 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115574 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115577 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115579 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115581 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115584 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115586 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115589 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115591 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115594 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115596 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115599 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115602 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115604 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115606 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115609 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:21.116937 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115611 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115614 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115616 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115619 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115623 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.115626 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116023 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116028 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116031 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116034 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116036 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116039 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116042 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116044 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116047 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116049 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116051 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116054 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116057 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116059 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:21.117420 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116062 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116064 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116067 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116070 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116072 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116075 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116077 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116080 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116082 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116085 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116088 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116092 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116095 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116098 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116101 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116103 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116107 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116109 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116112 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116115 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:21.117936 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116118 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116121 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116124 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116126 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116129 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116131 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116133 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116136 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116138 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116141 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116143 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116146 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116148 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116151 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116155 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116158 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116161 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116163 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116166 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116170 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:21.118487 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116173 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116175 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116178 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116180 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116182 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116185 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116187 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116190 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116192 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116195 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116200 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116202 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116205 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116207 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116210 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116213 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116215 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116217 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116220 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116222 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:21.118975 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116225 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116227 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116229 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116232 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116234 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116237 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116239 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116242 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116244 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116247 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116249 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.116252 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117441 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117452 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117458 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117463 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117469 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117472 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117477 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117482 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117485 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:21.119481 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117488 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117493 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117497 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117500 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117504 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117507 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117510 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117513 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117516 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117519 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117526 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117529 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117532 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117535 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117538 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117542 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117546 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117549 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117552 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117555 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117558 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117561 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117564 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117568 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117576 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:21.120248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117579 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117582 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117585 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117588 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117591 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117596 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117599 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117602 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117605 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117609 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117613 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117616 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117620 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117623 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117626 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117629 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117632 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117635 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117638 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117641 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117644 2570 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117648 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117651 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117654 2570
flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117658 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117661 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:21.121000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117664 2570 flags.go:64] FLAG: --help="false" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117667 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-136-109.ec2.internal" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117670 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117673 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117676 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117680 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117683 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117686 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117689 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117692 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117695 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:21.121709 ip-10-0-136-109 
kubenswrapper[2570]: I0416 13:59:21.117698 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117701 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117704 2570 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117707 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117711 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117714 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117717 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117720 2570 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117723 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117726 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117729 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117735 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:21.121709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117738 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117741 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117744 2570 flags.go:64] FLAG: 
--logging-format="text" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117746 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117750 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117753 2570 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117756 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117761 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117763 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117768 2570 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117771 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117774 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117777 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117780 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117783 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117786 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117789 2570 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117798 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117802 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117805 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117808 2570 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117811 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117817 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117820 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:21.122341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117823 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117826 2570 flags.go:64] FLAG: --port="10250" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117830 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117833 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c2f501f236aec4f9" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117836 2570 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117841 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117845 
2570 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117848 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117851 2570 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117859 2570 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117862 2570 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117865 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117868 2570 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117871 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117874 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117877 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117880 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117884 2570 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117887 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117889 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117892 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 
13:59:21.117895 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117898 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117901 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117904 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117908 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:21.123041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117910 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117913 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117916 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117919 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117922 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117925 2570 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117928 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117934 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117937 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117940 2570 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117944 2570 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117949 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117952 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117955 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117958 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117961 2570 flags.go:64] FLAG: --v="2" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117966 2570 flags.go:64] FLAG: --version="false" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117970 2570 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117974 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.117977 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118075 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118079 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118082 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118085 2570 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 16 13:59:21.123799 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118087 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118091 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118095 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118097 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118100 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118103 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118105 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118108 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118110 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118113 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118115 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118118 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:21.124407 
ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118121 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118124 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118126 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118129 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118131 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118134 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118137 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:21.124407 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118141 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118144 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118146 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118149 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118151 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118154 2570 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118156 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118159 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118161 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118164 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118166 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118169 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118171 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118174 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118176 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118179 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118181 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118184 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:21.124964 ip-10-0-136-109 
kubenswrapper[2570]: W0416 13:59:21.118187 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:21.124964 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118189 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118192 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118194 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118197 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118199 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118202 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118204 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118207 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118209 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118211 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118214 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118217 2570 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118219 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118224 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118226 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118229 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118231 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118234 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118236 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:21.125557 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118239 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118243 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118246 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118249 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118252 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118254 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118257 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118259 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118262 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118265 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118267 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118270 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118272 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118275 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118277 2570 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118280 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118282 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118285 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118291 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118294 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:21.126030 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118296 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:21.126635 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118298 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:21.126635 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118301 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:21.126635 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118304 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:21.126635 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.118307 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:21.126635 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.118866 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:21.127886 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.127865 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:21.127927 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.127888 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:21.127960 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127950 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:21.127960 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127956 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:21.127960 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127959 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127962 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127965 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127968 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127971 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127974 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127977 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127979 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127982 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127985 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127987 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127990 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127992 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127995 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.127998 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128002 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128006 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128008 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128011 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:21.128036 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128013 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128020 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128023 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128026 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128028 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128031 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128033 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128036 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128039 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128043 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128048 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128051 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128054 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128058 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128060 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128063 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128066 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128068 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128071 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:21.128530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128073 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128076 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128078 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128081 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128083 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128086 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128089 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128092 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128094 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128097 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128099 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128101 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128104 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128107 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128109 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128113 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128116 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128118 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128121 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128123 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:21.129020 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128126 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128128 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128131 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128133 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128136 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128139 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128141 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128144 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128146 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128149 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128151 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128154 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128156 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128159 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128161 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128164 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128167 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128170 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128173 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128175 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:21.129600 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128177 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128180 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128182 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128185 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128187 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128190 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.128195 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128336 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128343 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128346 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128349 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128352 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128354 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128357 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128360 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128362 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:21.130128 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128365 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128368 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128370 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128373 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128375 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128378 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128380 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128383 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128385 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128388 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128391 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128393 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128396 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128399 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128402 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128404 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128407 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128409 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128412 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128415 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:21.130552 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128417 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128420 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128422 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128426 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128428 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128431 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128434 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128436 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128439 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128442 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128444 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128447 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128450 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128452 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128455 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128457 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128460 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128462 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128466 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:21.131059 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128470 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128473 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128476 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128479 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128482 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128485 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128489 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128492 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128495 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128498 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128500 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128503 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128506 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128508 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128511 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128513 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128516 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128519 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128521 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:21.131536 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128524 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128527 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128529 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128532 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128535 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128538 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128540 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128543 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128546 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128548 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128557 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128561 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128563 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128566 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128569 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128571 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128574 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128576 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:21.132013 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:21.128579 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:21.132473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.128583 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:21.132473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.128701 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:21.132473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.131387 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:21.132473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.132343 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:21.132473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.132435 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:21.132620 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.132477 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:21.158696 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.158674 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:21.161178 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.161160 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:21.177377 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.177353 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:21.182065 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.182047 2570 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:21.183196 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.183179 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:21.188095 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.188079 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:21.188649 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.188630 2570 fs.go:135] Filesystem UUIDs: map[1c65ac9b-6256-4bfd-be53-5b3f89a050c2:/dev/nvme0n1p3 2c1abfc7-1525-4db7-9b39-dc43f7718f9c:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 13:59:21.188698 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.188649 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:21.194511 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.194394 2570 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:21.193149235 +0000 UTC m=+0.417820461 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099654 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b0e0e4cdb54669cba7d118495e896 SystemUUID:ec2b0e0e-4cdb-5466-9cba-7d118495e896 BootID:cd9b8b10-ee2c-4ee9-90e8-c33091a3816b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:a7:eb:56:4d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:a7:eb:56:4d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:12:a3:c3:b6:17 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:21.195186 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.195175 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:21.195279 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.195267 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:21.196474 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.196453 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:21.196624 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.196478 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-109.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:21.196671 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.196633 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:21.196671 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.196642 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:21.196671 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.196655 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:21.197421 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.197411 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:59:21.198197 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.198186 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:21.198299 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.198290 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:59:21.200598 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.200589 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:59:21.201335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.201325 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:59:21.201372 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.201345 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:59:21.201372 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.201354 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:59:21.201372 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.201364 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 13:59:21.202428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.202416 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:21.202472 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.202440 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:59:21.205459 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.205443 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:59:21.206118 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.206104 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-smspx"
Apr 16 13:59:21.207145 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.207129 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 13:59:21.208731 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208719 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208736 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208742 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208747 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208754 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208759 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208765 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208770 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208779 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 13:59:21.208793 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208785 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 13:59:21.209032 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208800 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 13:59:21.209032 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.208809 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 13:59:21.209700 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.209691 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 13:59:21.209700 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.209700 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 13:59:21.213352 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.213304 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-109.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 13:59:21.213352 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.213330 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 13:59:21.213496 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213364 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-smspx"
Apr 16 13:59:21.213496 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213388 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 13:59:21.213496 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213385 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-109.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 13:59:21.213496 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213426 2570 server.go:1295] "Started kubelet"
Apr 16 13:59:21.213613 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213508 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 13:59:21.213613 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213529 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 13:59:21.213613 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.213592 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 13:59:21.214239 ip-10-0-136-109 systemd[1]: Started Kubernetes Kubelet.
Apr 16 13:59:21.215256 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.215235 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 13:59:21.216341 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.216326 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 13:59:21.222896 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.222878 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 13:59:21.223092 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.222902 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:21.224030 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224010 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 13:59:21.224030 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224031 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 13:59:21.224177 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224113 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:59:21.224177 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224175 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 13:59:21.224177 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224183 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:59:21.224459 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224438 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:59:21.224459 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224460 2570 factory.go:55] Registering systemd factory
Apr 16 13:59:21.224597 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224470 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:59:21.224597 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.224552 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:59:21.224709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224689 2570 factory.go:153] Registering CRI-O factory
Apr 16 13:59:21.224709 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224702 2570 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:59:21.224797 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224721 2570 factory.go:103] Registering Raw factory
Apr 16 13:59:21.224797 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.224736 2570 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:59:21.224908 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.224889 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.225761 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.225743 2570 manager.go:319] Starting recovery of all containers
Apr 16 13:59:21.225859 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.225795 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:21.232587 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.232563 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-109.ec2.internal\" not found" node="ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.239578 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.239465 2570 manager.go:324] Recovery completed
Apr 16 13:59:21.243439 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.243424 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:21.246094 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.246075 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:21.246163 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.246106 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:21.246163 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.246116 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:21.246568 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.246555 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:59:21.246652 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.246567 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:59:21.246652 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.246586 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:21.249556 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.249539 2570 policy_none.go:49] "None policy: Start"
Apr 16 13:59:21.249556 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.249555 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:59:21.249673 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.249566 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.303882 2570 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.303910 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.303920 2570 server.go:85] "Starting device plugin registration server"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.304187 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.304198 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.304305 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.304413 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.304426 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.305024 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:59:21.309758 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.305066 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.354666 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.354619 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:59:21.355834 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.355815 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:59:21.355955 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.355847 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:59:21.355955 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.355865 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:59:21.355955 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.355872 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:59:21.355955 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.355911 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:59:21.358237 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.358213 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:21.405179 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.405114 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:21.406090 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.406076 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:21.406149 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.406106 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:21.406149 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.406120 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:21.406149 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.406150 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.414471 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.414455 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.414534 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.414477 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-109.ec2.internal\": node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.433591 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.433566 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.456145 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.456113 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"]
Apr 16 13:59:21.456228 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.456200 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:21.457036 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.457022 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:21.457120 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.457052 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:21.457120 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.457065 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:21.459307 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.459292 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:21.459428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.459413 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.459476 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.459464 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:21.460050 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.460029 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:21.460135 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.460060 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:21.460135 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.460070 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:21.460135 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.460036 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:21.460135 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.460132 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:21.460258 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.460143 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:21.462705 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.462685 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.462801 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.462712 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:21.463384 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.463365 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:21.463468 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.463412 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:21.463468 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.463427 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:21.489077 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.489054 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-109.ec2.internal\" not found" node="ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.493682 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.493664 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-109.ec2.internal\" not found" node="ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.525991 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.525966 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a325dd6c6725307a6ffbdfee2361b7c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal\" (UID: \"9a325dd6c6725307a6ffbdfee2361b7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.525991 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.525995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a325dd6c6725307a6ffbdfee2361b7c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal\" (UID: \"9a325dd6c6725307a6ffbdfee2361b7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.526170 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.526018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a1992b488b3339bc008fb24c80291d9-config\") pod \"kube-apiserver-proxy-ip-10-0-136-109.ec2.internal\" (UID: \"2a1992b488b3339bc008fb24c80291d9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.534220 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.534198 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.626345 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.626296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a325dd6c6725307a6ffbdfee2361b7c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal\" (UID: \"9a325dd6c6725307a6ffbdfee2361b7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.626345 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.626344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a325dd6c6725307a6ffbdfee2361b7c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal\" (UID: \"9a325dd6c6725307a6ffbdfee2361b7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.626566 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.626361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a1992b488b3339bc008fb24c80291d9-config\") pod \"kube-apiserver-proxy-ip-10-0-136-109.ec2.internal\" (UID: \"2a1992b488b3339bc008fb24c80291d9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.626566 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.626387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a1992b488b3339bc008fb24c80291d9-config\") pod \"kube-apiserver-proxy-ip-10-0-136-109.ec2.internal\" (UID: \"2a1992b488b3339bc008fb24c80291d9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.626566 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.626410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a325dd6c6725307a6ffbdfee2361b7c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal\" (UID: \"9a325dd6c6725307a6ffbdfee2361b7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.626566 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.626408 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a325dd6c6725307a6ffbdfee2361b7c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal\" (UID: \"9a325dd6c6725307a6ffbdfee2361b7c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.634393 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.634375 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.735336 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.735254 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.793471 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.793437 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.795965 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:21.795947 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:21.835881 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.835853 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:21.936423 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:21.936377 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.036933 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.036860 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.132467 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.132440 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:22.133037 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.132598 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:22.133037 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.132638 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:22.137574 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.137557 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.215774 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.215729 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:21 +0000 UTC" deadline="2027-12-09 19:25:14.94404004 +0000 UTC"
Apr 16 13:59:22.215774 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.215767 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14453h25m52.728276075s"
Apr 16 13:59:22.223292 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.223276 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:22.232117 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.232101 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:22.237982 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.237967 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.264628 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.264608 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zdt7b"
Apr 16 13:59:22.272541 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.272525 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zdt7b"
Apr 16 13:59:22.338396 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.338340 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.366348 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.366326 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:22.438554 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.438528 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.538998 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.538964 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.639502 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:22.639432 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-109.ec2.internal\" not found"
Apr 16 13:59:22.655158 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.655130 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:22.694745 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:22.694713 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a325dd6c6725307a6ffbdfee2361b7c.slice/crio-6df1bbe99b5f254b52f5d8334c11a84fbfff71842b64dd7e83cfe62e48914486 WatchSource:0}: Error finding container 6df1bbe99b5f254b52f5d8334c11a84fbfff71842b64dd7e83cfe62e48914486: Status 404 returned error can't find the container with id 6df1bbe99b5f254b52f5d8334c11a84fbfff71842b64dd7e83cfe62e48914486
Apr 16 13:59:22.695208 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:22.695188 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1992b488b3339bc008fb24c80291d9.slice/crio-ef0a59be937b115098def6940824fbc82ee710807da03f9cf3332138402123db WatchSource:0}: Error finding container ef0a59be937b115098def6940824fbc82ee710807da03f9cf3332138402123db: Status 404 returned error can't find the container with id ef0a59be937b115098def6940824fbc82ee710807da03f9cf3332138402123db
Apr 16 13:59:22.700479 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.700458 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:59:22.724622 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.724588 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:22.736266 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.736247 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:22.737082 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.737070 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal"
Apr 16 13:59:22.743902 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:22.743886 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 13:59:23.195331 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.195293 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:23.202228 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.202212 2570 apiserver.go:52] "Watching apiserver"
Apr 16
13:59:23.209022 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.209004 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:59:23.210410 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.210391 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-x8zzl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal","openshift-multus/multus-additional-cni-plugins-mvgp6","openshift-multus/multus-skgt6","openshift-multus/network-metrics-daemon-2dz2d","openshift-network-diagnostics/network-check-target-m5nn8","openshift-network-operator/iptables-alerter-gn66w","kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq","openshift-cluster-node-tuning-operator/tuned-b8kz6","openshift-dns/node-resolver-zvl68","openshift-image-registry/node-ca-q4pbl","openshift-ovn-kubernetes/ovnkube-node-fkt9w"] Apr 16 13:59:23.214478 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.214464 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.216531 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.216512 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.216907 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.216886 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.217020 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.216994 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nnzhc\"" Apr 16 13:59:23.217072 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.217046 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:23.217121 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.217104 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.218560 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.218543 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.219198 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.218991 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:23.219198 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.219075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.219198 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.219092 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:23.219198 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.219157 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:23.219198 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.219168 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.219198 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.219191 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vsbf4\"" Apr 16 13:59:23.220562 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.220547 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:23.220819 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.220807 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:23.220887 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.220856 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb" Apr 16 13:59:23.220952 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.220935 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nnffk\"" Apr 16 13:59:23.223011 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.222993 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:23.223098 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.223067 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac" Apr 16 13:59:23.225241 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.225227 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.227487 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.227471 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:23.227578 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.227474 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-67l4s\"" Apr 16 13:59:23.227578 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.227522 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:23.227674 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.227611 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.230473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.230177 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:23.230473 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.230409 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.230622 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.230551 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p46mk\"" Apr 16 13:59:23.230794 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.230726 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.232728 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.232709 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.232968 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.232953 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-cni-multus\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233049 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.232977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-kubelet\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233049 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.232994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-conf-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233049 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233010 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45edff06-17ec-4445-a612-10113a6f9a02-cni-binary-copy\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233183 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnlt4\" (UniqueName: 
\"kubernetes.io/projected/45edff06-17ec-4445-a612-10113a6f9a02-kube-api-access-gnlt4\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233183 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233081 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:23.233183 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cnibin\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233183 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sndzr\" (UniqueName: \"kubernetes.io/projected/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-kube-api-access-sndzr\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233183 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233143 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-multus-certs\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233183 
ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233160 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-k8s-cni-cncf-io\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233183 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-netns\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-etc-kubernetes\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233230 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ad29733-1e32-4cb7-9641-906b311b4961-konnectivity-ca\") pod \"konnectivity-agent-x8zzl\" (UID: 
\"3ad29733-1e32-4cb7-9641-906b311b4961\") " pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7d916fa4-9672-4e7a-be82-02e78c5a0df3-iptables-alerter-script\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233340 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npd47\" (UniqueName: \"kubernetes.io/projected/7d916fa4-9672-4e7a-be82-02e78c5a0df3-kube-api-access-npd47\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233386 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-hostroot\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233407 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-system-cni-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233522 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-socket-dir-parent\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233550 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-cni-bin\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/45edff06-17ec-4445-a612-10113a6f9a02-multus-daemon-config\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233921 ip-10-0-136-109 
kubenswrapper[2570]: I0416 13:59:23.233605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5mn\" (UniqueName: \"kubernetes.io/projected/f570a9dc-9480-415b-9633-11fb3c3a05eb-kube-api-access-nn5mn\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d916fa4-9672-4e7a-be82-02e78c5a0df3-host-slash\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233653 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-os-release\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233676 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-cni-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-system-cni-dir\") pod 
\"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233763 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-cnibin\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-os-release\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.233921 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.233833 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ad29733-1e32-4cb7-9641-906b311b4961-agent-certs\") pod \"konnectivity-agent-x8zzl\" (UID: \"3ad29733-1e32-4cb7-9641-906b311b4961\") " pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.234680 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.234663 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.234846 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.234828 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.234907 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.234855 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7cvl4\"" Apr 16 13:59:23.234907 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.234892 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.235042 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.235028 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.236929 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.236916 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.237000 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.236962 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.237103 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.237089 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.237182 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.237113 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-h5c4g\"" Apr 16 13:59:23.237231 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.237181 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.237231 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.237193 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.237359 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.237307 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8vhv7\"" Apr 16 13:59:23.237428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.237381 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:23.239248 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.239227 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2ffqx\"" Apr 16 13:59:23.239404 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.239234 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:23.239404 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.239257 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:23.239531 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.239517 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:23.239567 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.239546 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:23.240350 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.240331 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:23.240429 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.240337 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:23.274697 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.274671 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:22 +0000 UTC" deadline="2027-09-14 22:11:27.337544528 +0000 UTC" Apr 16 13:59:23.274697 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.274695 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12392h12m4.06285201s" Apr 16 13:59:23.325157 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.325137 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:23.334229 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334202 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnlt4\" (UniqueName: \"kubernetes.io/projected/45edff06-17ec-4445-a612-10113a6f9a02-kube-api-access-gnlt4\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.334375 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:23.334375 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334287 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-run-netns\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.334375 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-env-overrides\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.334506 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-device-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.334506 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334496 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d62a9568-15dd-4b2a-b879-e1ae35037432-tmp-dir\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.334578 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cnibin\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.334578 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-multus-certs\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.334672 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334640 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-multus-certs\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.334829 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334807 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-ovn\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.334955 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cnibin\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.334955 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-etc-selinux\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.335041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.334980 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-kubernetes\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.335041 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335013 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-lib-modules\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.335644 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm56j\" (UniqueName: \"kubernetes.io/projected/d62a9568-15dd-4b2a-b879-e1ae35037432-kube-api-access-xm56j\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.335734 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335687 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-netns\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.335806 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:23.335806 ip-10-0-136-109 
kubenswrapper[2570]: I0416 13:59:23.335781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ad29733-1e32-4cb7-9641-906b311b4961-konnectivity-ca\") pod \"konnectivity-agent-x8zzl\" (UID: \"3ad29733-1e32-4cb7-9641-906b311b4961\") " pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.335910 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-cni-bin\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.335955 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7d916fa4-9672-4e7a-be82-02e78c5a0df3-iptables-alerter-script\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.336004 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.335958 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.336050 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.336038 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:23.836012652 +0000 UTC m=+3.060683873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.336233 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336207 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-netns\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.336303 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.335959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-serviceca\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwzz\" (UniqueName: \"kubernetes.io/projected/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-kube-api-access-tqwzz\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 
13:59:23.336534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-log-socket\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd5d9\" (UniqueName: \"kubernetes.io/projected/d47f2738-9503-4e5e-8359-c1d73e1fc168-kube-api-access-qd5d9\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-socket-dir-parent\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-sys-fs\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336718 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-systemd\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-os-release\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-systemd\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-tuned\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-cnibin\") pod 
\"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ad29733-1e32-4cb7-9641-906b311b4961-agent-certs\") pod \"konnectivity-agent-x8zzl\" (UID: \"3ad29733-1e32-4cb7-9641-906b311b4961\") " pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-modprobe-d\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-conf-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-systemd-units\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.337428 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.336998 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ad29733-1e32-4cb7-9641-906b311b4961-konnectivity-ca\") pod 
\"konnectivity-agent-x8zzl\" (UID: \"3ad29733-1e32-4cb7-9641-906b311b4961\") " pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337052 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkh9\" (UniqueName: \"kubernetes.io/projected/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-kube-api-access-xzkh9\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7d916fa4-9672-4e7a-be82-02e78c5a0df3-iptables-alerter-script\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysctl-d\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337148 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-run\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337170 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-socket-dir-parent\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45edff06-17ec-4445-a612-10113a6f9a02-cni-binary-copy\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337242 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-cni-netd\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337246 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-conf-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/d62a9568-15dd-4b2a-b879-e1ae35037432-hosts-file\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337286 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-os-release\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-cnibin\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337307 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sndzr\" (UniqueName: \"kubernetes.io/projected/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-kube-api-access-sndzr\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovnkube-script-lib\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337400 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysctl-conf\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337790 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:23.338200 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45edff06-17ec-4445-a612-10113a6f9a02-cni-binary-copy\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-k8s-cni-cncf-io\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-etc-kubernetes\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.337998 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-registration-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-sys\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npd47\" (UniqueName: \"kubernetes.io/projected/7d916fa4-9672-4e7a-be82-02e78c5a0df3-kube-api-access-npd47\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338117 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-hostroot\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338150 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-run-ovn-kubernetes\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338180 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovn-node-metrics-cert\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-system-cni-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-cni-bin\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-etc-kubernetes\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 
13:59:23.338387 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/45edff06-17ec-4445-a612-10113a6f9a02-multus-daemon-config\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5mn\" (UniqueName: \"kubernetes.io/projected/f570a9dc-9480-415b-9633-11fb3c3a05eb-kube-api-access-nn5mn\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-run-k8s-cni-cncf-io\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-slash\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.338863 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338608 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovnkube-config\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 
13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338605 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-hostroot\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xb6b\" (UniqueName: \"kubernetes.io/projected/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-kube-api-access-2xb6b\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-system-cni-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338807 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-cni-bin\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d916fa4-9672-4e7a-be82-02e78c5a0df3-host-slash\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-cni-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.338929 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d916fa4-9672-4e7a-be82-02e78c5a0df3-host-slash\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-host\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-var-lib-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339107 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-multus-cni-dir\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-host\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d47f2738-9503-4e5e-8359-c1d73e1fc168-tmp\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-system-cni-dir\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339304 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-system-cni-dir\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.339609 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339376 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/45edff06-17ec-4445-a612-10113a6f9a02-multus-daemon-config\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.339442 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.339460 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.339478 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-os-release\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-etc-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.339598 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 13:59:23.839569388 +0000 UTC m=+3.064240618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339682 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-os-release\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339721 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339811 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cni-binary-copy\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339829 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysconfig\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-cni-multus\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-kubelet\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340335 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.339992 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-cni-multus\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340998 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.340491 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-kubelet\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.340998 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.340506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/45edff06-17ec-4445-a612-10113a6f9a02-host-var-lib-kubelet\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.340998 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.340576 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-node-log\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.340998 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.340602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.340998 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.340619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-socket-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.340998 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.340635 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-var-lib-kubelet\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.341777 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.341755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ad29733-1e32-4cb7-9641-906b311b4961-agent-certs\") pod \"konnectivity-agent-x8zzl\" (UID: \"3ad29733-1e32-4cb7-9641-906b311b4961\") " pod="kube-system/konnectivity-agent-x8zzl"
Apr 16 13:59:23.341953 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.341938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnlt4\" (UniqueName: \"kubernetes.io/projected/45edff06-17ec-4445-a612-10113a6f9a02-kube-api-access-gnlt4\") pod \"multus-skgt6\" (UID: \"45edff06-17ec-4445-a612-10113a6f9a02\") " pod="openshift-multus/multus-skgt6"
Apr 16 13:59:23.345255 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.345232 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npd47\" (UniqueName: \"kubernetes.io/projected/7d916fa4-9672-4e7a-be82-02e78c5a0df3-kube-api-access-npd47\") pod \"iptables-alerter-gn66w\" (UID: \"7d916fa4-9672-4e7a-be82-02e78c5a0df3\") " pod="openshift-network-operator/iptables-alerter-gn66w"
Apr 16 13:59:23.345352 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.345234 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sndzr\" (UniqueName: \"kubernetes.io/projected/1cbe0c2f-4375-424a-a6f9-acf5ed5f216c-kube-api-access-sndzr\") pod \"multus-additional-cni-plugins-mvgp6\" (UID: \"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c\") " pod="openshift-multus/multus-additional-cni-plugins-mvgp6"
Apr 16 13:59:23.345874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.345859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5mn\" (UniqueName: \"kubernetes.io/projected/f570a9dc-9480-415b-9633-11fb3c3a05eb-kube-api-access-nn5mn\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:23.359968 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.359920 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal" event={"ID":"9a325dd6c6725307a6ffbdfee2361b7c","Type":"ContainerStarted","Data":"6df1bbe99b5f254b52f5d8334c11a84fbfff71842b64dd7e83cfe62e48914486"}
Apr 16 13:59:23.360856 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.360835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal" event={"ID":"2a1992b488b3339bc008fb24c80291d9","Type":"ContainerStarted","Data":"ef0a59be937b115098def6940824fbc82ee710807da03f9cf3332138402123db"}
Apr 16 13:59:23.418215 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.418187 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:23.441291 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-etc-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441434 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441434 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441369 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441434 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-etc-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441434 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysconfig\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441434 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-kubelet\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysconfig\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441450 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-node-log\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-kubelet\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-socket-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-var-lib-kubelet\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441504 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-node-log\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441527 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-run-netns\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-var-lib-kubelet\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441580 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-env-overrides\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-device-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441595 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-run-netns\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d62a9568-15dd-4b2a-b879-e1ae35037432-tmp-dir\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-ovn\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-ovn\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441682 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-etc-selinux\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441661 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-device-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441657 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-socket-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441708 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-kubernetes\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441746 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-kubernetes\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441764 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-lib-modules\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm56j\" (UniqueName: \"kubernetes.io/projected/d62a9568-15dd-4b2a-b879-e1ae35037432-kube-api-access-xm56j\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441872 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-lib-modules\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-etc-selinux\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.441898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441862 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-cni-bin\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d62a9568-15dd-4b2a-b879-e1ae35037432-tmp-dir\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-serviceca\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwzz\" (UniqueName: \"kubernetes.io/projected/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-kube-api-access-tqwzz\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-cni-bin\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.441980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-log-socket\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd5d9\" (UniqueName: \"kubernetes.io/projected/d47f2738-9503-4e5e-8359-c1d73e1fc168-kube-api-access-qd5d9\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442028 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-sys-fs\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442050 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-systemd\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-env-overrides\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-log-socket\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442108 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-systemd\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442127 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-systemd\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-tuned\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-sys-fs\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442171 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-modprobe-d\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442193 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-systemd\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-systemd-units\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.442537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442234 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzkh9\" (UniqueName: \"kubernetes.io/projected/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-kube-api-access-xzkh9\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-systemd-units\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442304 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-run-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442325 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-modprobe-d\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysctl-d\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-run\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6"
Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416
13:59:23.442380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-cni-netd\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-serviceca\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d62a9568-15dd-4b2a-b879-e1ae35037432-hosts-file\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442448 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-cni-netd\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysctl-d\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442462 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d62a9568-15dd-4b2a-b879-e1ae35037432-hosts-file\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovnkube-script-lib\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysctl-conf\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442498 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-run\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-registration-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442562 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-sys\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443308 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-run-ovn-kubernetes\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-sysctl-conf\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442599 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-registration-dir\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovn-node-metrics-cert\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 
13:59:23.442638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-sys\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442652 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-run-ovn-kubernetes\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-slash\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442694 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-host-slash\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovnkube-config\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442735 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xb6b\" (UniqueName: \"kubernetes.io/projected/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-kube-api-access-2xb6b\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442772 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-host\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442796 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-var-lib-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-host\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442856 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-var-lib-openvswitch\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 
13:59:23.442859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d47f2738-9503-4e5e-8359-c1d73e1fc168-tmp\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d47f2738-9503-4e5e-8359-c1d73e1fc168-host\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-host\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.442998 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovnkube-script-lib\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.443824 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.443177 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovnkube-config\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.444420 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.444402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d47f2738-9503-4e5e-8359-c1d73e1fc168-etc-tuned\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.444714 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.444698 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-ovn-node-metrics-cert\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.444749 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.444705 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d47f2738-9503-4e5e-8359-c1d73e1fc168-tmp\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.449576 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.449507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwzz\" (UniqueName: \"kubernetes.io/projected/3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d-kube-api-access-tqwzz\") pod \"node-ca-q4pbl\" (UID: \"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d\") " pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.449669 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.449647 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd5d9\" (UniqueName: \"kubernetes.io/projected/d47f2738-9503-4e5e-8359-c1d73e1fc168-kube-api-access-qd5d9\") pod \"tuned-b8kz6\" (UID: \"d47f2738-9503-4e5e-8359-c1d73e1fc168\") " pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.449963 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.449938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xzkh9\" (UniqueName: \"kubernetes.io/projected/8ad82b8c-5f9d-40e3-bf04-ee7dff525d90-kube-api-access-xzkh9\") pod \"ovnkube-node-fkt9w\" (UID: \"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90\") " pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.450218 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.450196 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm56j\" (UniqueName: \"kubernetes.io/projected/d62a9568-15dd-4b2a-b879-e1ae35037432-kube-api-access-xm56j\") pod \"node-resolver-zvl68\" (UID: \"d62a9568-15dd-4b2a-b879-e1ae35037432\") " pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.450860 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.450846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xb6b\" (UniqueName: \"kubernetes.io/projected/591845d7-ab61-4e19-8a4c-e0c14a2f6c24-kube-api-access-2xb6b\") pod \"aws-ebs-csi-driver-node-nzjqq\" (UID: \"591845d7-ab61-4e19-8a4c-e0c14a2f6c24\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.525510 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.525472 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gn66w" Apr 16 13:59:23.533152 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.533122 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" Apr 16 13:59:23.533661 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.533546 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d916fa4_9672_4e7a_be82_02e78c5a0df3.slice/crio-5089fdebb5fca936ecca28f6da4d574370967f2bdc65a7bc0fe122311598556e WatchSource:0}: Error finding container 5089fdebb5fca936ecca28f6da4d574370967f2bdc65a7bc0fe122311598556e: Status 404 returned error can't find the container with id 5089fdebb5fca936ecca28f6da4d574370967f2bdc65a7bc0fe122311598556e Apr 16 13:59:23.540037 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.540017 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-skgt6" Apr 16 13:59:23.541976 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.541950 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cbe0c2f_4375_424a_a6f9_acf5ed5f216c.slice/crio-edd1ddf8d5ee4ba32388c352bd59e33f6179478d3832d47faa56f6176c3d23b8 WatchSource:0}: Error finding container edd1ddf8d5ee4ba32388c352bd59e33f6179478d3832d47faa56f6176c3d23b8: Status 404 returned error can't find the container with id edd1ddf8d5ee4ba32388c352bd59e33f6179478d3832d47faa56f6176c3d23b8 Apr 16 13:59:23.543949 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.543920 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:23.548863 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.548803 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45edff06_17ec_4445_a612_10113a6f9a02.slice/crio-70144251236b66af1aacfc2d2a57340034d999a2d3a4eec5265245f036641c0c WatchSource:0}: Error finding container 70144251236b66af1aacfc2d2a57340034d999a2d3a4eec5265245f036641c0c: Status 404 returned error can't find the container with id 70144251236b66af1aacfc2d2a57340034d999a2d3a4eec5265245f036641c0c Apr 16 13:59:23.550084 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.550008 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" Apr 16 13:59:23.553708 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.553684 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad29733_1e32_4cb7_9641_906b311b4961.slice/crio-851d8a66197ffd085cfa65c05ebd7a21d587b81d9c15ae7636bcc46f57dc012b WatchSource:0}: Error finding container 851d8a66197ffd085cfa65c05ebd7a21d587b81d9c15ae7636bcc46f57dc012b: Status 404 returned error can't find the container with id 851d8a66197ffd085cfa65c05ebd7a21d587b81d9c15ae7636bcc46f57dc012b Apr 16 13:59:23.555646 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.555623 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" Apr 16 13:59:23.558761 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.558739 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591845d7_ab61_4e19_8a4c_e0c14a2f6c24.slice/crio-c75fbdafee3985533d48febae1bd9a206560fdef3f6ff5e42502ae78aeab4fd6 WatchSource:0}: Error finding container c75fbdafee3985533d48febae1bd9a206560fdef3f6ff5e42502ae78aeab4fd6: Status 404 returned error can't find the container with id c75fbdafee3985533d48febae1bd9a206560fdef3f6ff5e42502ae78aeab4fd6 Apr 16 13:59:23.560417 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.560384 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zvl68" Apr 16 13:59:23.563284 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.563259 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47f2738_9503_4e5e_8359_c1d73e1fc168.slice/crio-faad8b240fbb1c8b0cd0984cee5c3bd2be275bfb89de603037060fbd9b7567a0 WatchSource:0}: Error finding container faad8b240fbb1c8b0cd0984cee5c3bd2be275bfb89de603037060fbd9b7567a0: Status 404 returned error can't find the container with id faad8b240fbb1c8b0cd0984cee5c3bd2be275bfb89de603037060fbd9b7567a0 Apr 16 13:59:23.566550 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.566530 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q4pbl" Apr 16 13:59:23.569204 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.569181 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62a9568_15dd_4b2a_b879_e1ae35037432.slice/crio-90045d2a6ac948963bb3abfb76ea493dc2eac9eaf5518ff9d5d8270b06aae0cb WatchSource:0}: Error finding container 90045d2a6ac948963bb3abfb76ea493dc2eac9eaf5518ff9d5d8270b06aae0cb: Status 404 returned error can't find the container with id 90045d2a6ac948963bb3abfb76ea493dc2eac9eaf5518ff9d5d8270b06aae0cb Apr 16 13:59:23.570656 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.570478 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:23.578530 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.578508 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f616ae2_8c1b_4e05_b95a_1e9e5ed4db5d.slice/crio-f5be60daf60049360e84f398964771ee433dcda9e7cdb763c7d62a77c36c2105 WatchSource:0}: Error finding container f5be60daf60049360e84f398964771ee433dcda9e7cdb763c7d62a77c36c2105: Status 404 returned error can't find the container with id f5be60daf60049360e84f398964771ee433dcda9e7cdb763c7d62a77c36c2105 Apr 16 13:59:23.582162 ip-10-0-136-109 kubenswrapper[2570]: W0416 13:59:23.582139 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad82b8c_5f9d_40e3_bf04_ee7dff525d90.slice/crio-dd50367dcc1eecf7239a57c547831bca8551cc6c47821111d291c23127e76e76 WatchSource:0}: Error finding container dd50367dcc1eecf7239a57c547831bca8551cc6c47821111d291c23127e76e76: Status 404 returned error can't find the container with id dd50367dcc1eecf7239a57c547831bca8551cc6c47821111d291c23127e76e76 Apr 16 13:59:23.845334 ip-10-0-136-109 
kubenswrapper[2570]: I0416 13:59:23.845239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:23.845334 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:23.845284 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:23.845562 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.845397 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.845562 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.845461 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.84544573 +0000 UTC m=+4.070116943 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.845562 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.845407 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:23.845562 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.845511 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:23.845562 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.845523 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:23.845562 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:23.845562 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.845550388 +0000 UTC m=+4.070221612 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:24.275628 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.275587 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:22 +0000 UTC" deadline="2028-01-27 21:04:26.294227998 +0000 UTC"
Apr 16 13:59:24.275628 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.275624 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15631h5m2.018607213s"
Apr 16 13:59:24.364134 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.364092 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"dd50367dcc1eecf7239a57c547831bca8551cc6c47821111d291c23127e76e76"}
Apr 16 13:59:24.365561 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.365538 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q4pbl" event={"ID":"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d","Type":"ContainerStarted","Data":"f5be60daf60049360e84f398964771ee433dcda9e7cdb763c7d62a77c36c2105"}
Apr 16 13:59:24.366539 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.366517 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zvl68" event={"ID":"d62a9568-15dd-4b2a-b879-e1ae35037432","Type":"ContainerStarted","Data":"90045d2a6ac948963bb3abfb76ea493dc2eac9eaf5518ff9d5d8270b06aae0cb"}
Apr 16 13:59:24.367986 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.367959 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" event={"ID":"d47f2738-9503-4e5e-8359-c1d73e1fc168","Type":"ContainerStarted","Data":"faad8b240fbb1c8b0cd0984cee5c3bd2be275bfb89de603037060fbd9b7567a0"}
Apr 16 13:59:24.369079 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.369056 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" event={"ID":"591845d7-ab61-4e19-8a4c-e0c14a2f6c24","Type":"ContainerStarted","Data":"c75fbdafee3985533d48febae1bd9a206560fdef3f6ff5e42502ae78aeab4fd6"}
Apr 16 13:59:24.370452 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.370251 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x8zzl" event={"ID":"3ad29733-1e32-4cb7-9641-906b311b4961","Type":"ContainerStarted","Data":"851d8a66197ffd085cfa65c05ebd7a21d587b81d9c15ae7636bcc46f57dc012b"}
Apr 16 13:59:24.371745 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.371721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-skgt6" event={"ID":"45edff06-17ec-4445-a612-10113a6f9a02","Type":"ContainerStarted","Data":"70144251236b66af1aacfc2d2a57340034d999a2d3a4eec5265245f036641c0c"}
Apr 16 13:59:24.374197 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.374136 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerStarted","Data":"edd1ddf8d5ee4ba32388c352bd59e33f6179478d3832d47faa56f6176c3d23b8"}
Apr 16 13:59:24.375290 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.375269 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gn66w" event={"ID":"7d916fa4-9672-4e7a-be82-02e78c5a0df3","Type":"ContainerStarted","Data":"5089fdebb5fca936ecca28f6da4d574370967f2bdc65a7bc0fe122311598556e"}
Apr 16 13:59:24.852739 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.852648 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:24.852739 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:24.852716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:24.852961 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:24.852825 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:24.852961 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:24.852855 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:24.852961 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:24.852861 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:24.852961 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:24.852869 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:24.852961 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:24.852933 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.852912102 +0000 UTC m=+6.077583330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:24.852961 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:24.852954 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.852943724 +0000 UTC m=+6.077614956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:25.356651 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:25.356619 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:25.357078 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:25.356619 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:25.357078 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:25.356769 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:25.357078 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:25.356857 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:26.868285 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:26.868246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:26.868745 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:26.868352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:26.868745 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:26.868514 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:26.868745 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:26.868727 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:30.868669164 +0000 UTC m=+10.093340399 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:26.868745 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:26.868515 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:26.868993 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:26.868754 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:26.868993 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:26.868768 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:26.868993 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:26.868812 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 13:59:30.868798664 +0000 UTC m=+10.093469884 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:27.356613 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:27.356490 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:27.356613 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:27.356532 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:27.356793 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:27.356629 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:27.356793 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:27.356769 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:29.356887 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:29.356387 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:29.356887 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:29.356403 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:29.356887 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:29.356540 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:29.356887 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:29.356650 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:30.902064 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:30.902030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:30.902092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:30.902199 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:30.902208 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:30.902221 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:30.902235 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:30.902272 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.902249171 +0000 UTC m=+18.126920398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:30.902517 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:30.902293 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.902283702 +0000 UTC m=+18.126954922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:31.357366 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:31.357229 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:31.357537 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:31.357365 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:31.357537 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:31.357444 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:31.357652 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:31.357550 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:32.075173 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.075134 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-sxl7w"]
Apr 16 13:59:32.077825 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.077799 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.077973 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:32.077871 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:32.212795 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.212757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3956767-9c3a-4525-b3d3-d3e177d9479f-kubelet-config\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.213010 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.212825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3956767-9c3a-4525-b3d3-d3e177d9479f-dbus\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.213010 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.212945 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.313500 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.313461 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.313680 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.313551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3956767-9c3a-4525-b3d3-d3e177d9479f-kubelet-config\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.313680 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.313599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3956767-9c3a-4525-b3d3-d3e177d9479f-dbus\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.313680 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:32.313606 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:32.313680 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:32.313674 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 13:59:32.813655615 +0000 UTC m=+12.038326831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:32.313680 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.313669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3956767-9c3a-4525-b3d3-d3e177d9479f-kubelet-config\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.313919 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.313776 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3956767-9c3a-4525-b3d3-d3e177d9479f-dbus\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.816608 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:32.816566 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:32.816779 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:32.816743 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:32.816852 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:32.816827 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 13:59:33.816805066 +0000 UTC m=+13.041476285 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:33.356091 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:33.356058 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:33.356577 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:33.356058 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:33.356577 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:33.356190 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:33.356577 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:33.356060 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:33.356577 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:33.356286 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:33.356577 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:33.356362 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:33.825540 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:33.825447 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:33.825695 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:33.825565 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:33.825695 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:33.825632 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 13:59:35.825617652 +0000 UTC m=+15.050288878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:35.356596 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:35.356558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:35.357020 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:35.356558 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:35.357020 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:35.356690 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:35.357020 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:35.356700 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:35.357020 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:35.356845 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:35.357020 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:35.356946 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:35.839894 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:35.839799 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:35.840039 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:35.839944 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:35.840039 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:35.840010 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.839992845 +0000 UTC m=+19.064664078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:37.356257 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:37.356218 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:37.356257 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:37.356240 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:37.356758 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:37.356219 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:37.356758 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:37.356356 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:37.356758 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:37.356484 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:37.356758 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:37.356560 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:38.967846 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:38.967792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:38.967846 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:38.967855 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:38.968515 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:38.967959 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:38.968515 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:38.967983 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:38.968515 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:38.968018 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:38.968515 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:38.968028 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:38.968515 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:38.968033 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.968013882 +0000 UTC m=+34.192685097 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:38.968515 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:38.968076 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.968062202 +0000 UTC m=+34.192733417 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:39.359662 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:39.359587 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:39.359896 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:39.359587 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:39.359896 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:39.359713 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:39.359896 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:39.359586 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:39.359896 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:39.359785 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb" Apr 16 13:59:39.359896 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:39.359875 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac" Apr 16 13:59:39.873898 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:39.873865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 13:59:39.874075 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:39.873985 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:39.874075 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:39.874045 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 13:59:47.874028211 +0000 UTC m=+27.098699428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:41.358115 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.357183 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 13:59:41.358115 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.357281 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:41.358115 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:41.357852 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac" Apr 16 13:59:41.358115 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:41.357726 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f" Apr 16 13:59:41.358115 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.357309 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:41.358115 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:41.357966 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb" Apr 16 13:59:41.402759 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.402729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal" event={"ID":"2a1992b488b3339bc008fb24c80291d9","Type":"ContainerStarted","Data":"bb1e059d10d73f5b3eca7ec0fedfc200fa7648420494e123527bb98471ac2711"} Apr 16 13:59:41.406084 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406058 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406571 2570 generic.go:358] "Generic (PLEG): container finished" podID="8ad82b8c-5f9d-40e3-bf04-ee7dff525d90" containerID="d1149f8985c1f5969fae648d939b002c6799ed680d06a41219e643512f3cefd2" exitCode=1 Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406643 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"ae062c220892120f2547ed44fdbf0bd610c0d3961ac7b7eef0aaeededdc65055"} Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" 
event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"60822f89e85884d0760757cd8b270fea305756ef27c51239ae64c6f8c9c8b18c"} Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406682 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"68ca06f778b2687484b1bb3253097e21022a5f783adfbdc16c94f4d3da7d42ed"} Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406695 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"090377c2e20454fcbb7ebe41bf36985191f1311ae9dcc00e6bff437c83592b30"} Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406707 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerDied","Data":"d1149f8985c1f5969fae648d939b002c6799ed680d06a41219e643512f3cefd2"} Apr 16 13:59:41.406874 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.406720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"37696e70a867f3b4bf5a42e284d0fce29daa3a10832d5601ea114e0d70523cf9"} Apr 16 13:59:41.409677 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.409648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" event={"ID":"d47f2738-9503-4e5e-8359-c1d73e1fc168","Type":"ContainerStarted","Data":"a847bb4491995599652720523787f64d1d102c862a12686c4012e456b9d1097b"} Apr 16 13:59:41.411143 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.411124 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-skgt6" event={"ID":"45edff06-17ec-4445-a612-10113a6f9a02","Type":"ContainerStarted","Data":"80816fd2c92e802c64e703fe46d8e57b678249bf610f316bc939a1331f58dcc0"} Apr 16 13:59:41.416994 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.416962 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-109.ec2.internal" podStartSLOduration=19.416950997 podStartE2EDuration="19.416950997s" podCreationTimestamp="2026-04-16 13:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:41.416789639 +0000 UTC m=+20.641460873" watchObservedRunningTime="2026-04-16 13:59:41.416950997 +0000 UTC m=+20.641622211" Apr 16 13:59:41.432704 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.432662 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-skgt6" podStartSLOduration=3.488573008 podStartE2EDuration="20.432650498s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.552028141 +0000 UTC m=+2.776699353" lastFinishedPulling="2026-04-16 13:59:40.496105626 +0000 UTC m=+19.720776843" observedRunningTime="2026-04-16 13:59:41.432104653 +0000 UTC m=+20.656775888" watchObservedRunningTime="2026-04-16 13:59:41.432650498 +0000 UTC m=+20.657321732" Apr 16 13:59:41.452585 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:41.452540 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b8kz6" podStartSLOduration=3.8275481190000002 podStartE2EDuration="20.452526643s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.565190342 +0000 UTC m=+2.789861561" lastFinishedPulling="2026-04-16 13:59:40.190168857 +0000 UTC m=+19.414840085" observedRunningTime="2026-04-16 13:59:41.452096058 +0000 UTC m=+20.676767293" 
watchObservedRunningTime="2026-04-16 13:59:41.452526643 +0000 UTC m=+20.677197878" Apr 16 13:59:42.378966 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.378801 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:42.414596 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.414565 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" event={"ID":"591845d7-ab61-4e19-8a4c-e0c14a2f6c24","Type":"ContainerStarted","Data":"f0785f1154a80d3126789e3b9ea07159cf98b96e74a7c78f2e6cd3769a67fb15"} Apr 16 13:59:42.414596 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.414603 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" event={"ID":"591845d7-ab61-4e19-8a4c-e0c14a2f6c24","Type":"ContainerStarted","Data":"c57ab97b0fcff9b928c921f68030a30a2b7fed720ee2991ce0a905c77907720c"} Apr 16 13:59:42.415740 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.415717 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-x8zzl" event={"ID":"3ad29733-1e32-4cb7-9641-906b311b4961","Type":"ContainerStarted","Data":"051e268de45c72870dc4e900f5db73871a227d5da9db2ec2055ef0646f3fa0bf"} Apr 16 13:59:42.417021 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.416994 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbe0c2f-4375-424a-a6f9-acf5ed5f216c" containerID="efc0b48a6377b02d6e26cdb7eb0b80ee9bfc244490d998f24068c638af63d17f" exitCode=0 Apr 16 13:59:42.417121 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.417019 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerDied","Data":"efc0b48a6377b02d6e26cdb7eb0b80ee9bfc244490d998f24068c638af63d17f"} Apr 16 
13:59:42.418294 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.418257 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gn66w" event={"ID":"7d916fa4-9672-4e7a-be82-02e78c5a0df3","Type":"ContainerStarted","Data":"8c68259be851def47f9b16d55eabff9046dc209f43335c66042c32ec29d729f7"} Apr 16 13:59:42.419635 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.419589 2570 generic.go:358] "Generic (PLEG): container finished" podID="9a325dd6c6725307a6ffbdfee2361b7c" containerID="e9777ad3529ea6a86799a158e2e4dbb07e6b30eb4c964578eb3e3a62a710324e" exitCode=0 Apr 16 13:59:42.419691 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.419669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal" event={"ID":"9a325dd6c6725307a6ffbdfee2361b7c","Type":"ContainerDied","Data":"e9777ad3529ea6a86799a158e2e4dbb07e6b30eb4c964578eb3e3a62a710324e"} Apr 16 13:59:42.421014 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.420997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q4pbl" event={"ID":"3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d","Type":"ContainerStarted","Data":"0c167827e6e91ae91ce2f16465f52d72c0cbdd74184ed3535f09ee339c685a8b"} Apr 16 13:59:42.422299 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.422277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zvl68" event={"ID":"d62a9568-15dd-4b2a-b879-e1ae35037432","Type":"ContainerStarted","Data":"65b3bd23c25c5ab6559a13ea5d1df848e9d1512eca955c3b8cf06dbe9a322ce7"} Apr 16 13:59:42.429819 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.429776 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-x8zzl" podStartSLOduration=13.510670912 podStartE2EDuration="21.429762699s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 
13:59:23.55575356 +0000 UTC m=+2.780424772" lastFinishedPulling="2026-04-16 13:59:31.474845333 +0000 UTC m=+10.699516559" observedRunningTime="2026-04-16 13:59:42.429369216 +0000 UTC m=+21.654040452" watchObservedRunningTime="2026-04-16 13:59:42.429762699 +0000 UTC m=+21.654433936" Apr 16 13:59:42.456045 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.455628 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zvl68" podStartSLOduration=4.837273083 podStartE2EDuration="21.455608657s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.571834413 +0000 UTC m=+2.796505627" lastFinishedPulling="2026-04-16 13:59:40.190169985 +0000 UTC m=+19.414841201" observedRunningTime="2026-04-16 13:59:42.442741667 +0000 UTC m=+21.667412902" watchObservedRunningTime="2026-04-16 13:59:42.455608657 +0000 UTC m=+21.680279893" Apr 16 13:59:42.456045 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.455755 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q4pbl" podStartSLOduration=4.592577459 podStartE2EDuration="21.455746735s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.580912684 +0000 UTC m=+2.805583906" lastFinishedPulling="2026-04-16 13:59:40.444081969 +0000 UTC m=+19.668753182" observedRunningTime="2026-04-16 13:59:42.455159757 +0000 UTC m=+21.679830993" watchObservedRunningTime="2026-04-16 13:59:42.455746735 +0000 UTC m=+21.680417969" Apr 16 13:59:42.485142 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:42.485080 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gn66w" podStartSLOduration=4.547835213 podStartE2EDuration="21.485062902s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.535483175 +0000 UTC m=+2.760154389" lastFinishedPulling="2026-04-16 
13:59:40.472710859 +0000 UTC m=+19.697382078" observedRunningTime="2026-04-16 13:59:42.484197509 +0000 UTC m=+21.708868772" watchObservedRunningTime="2026-04-16 13:59:42.485062902 +0000 UTC m=+21.709734138" Apr 16 13:59:43.317228 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.317109 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:42.378963771Z","UUID":"240653d3-3f08-46b0-8ce6-efb6d2c934ba","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:43.321391 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.321020 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:43.321391 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.321064 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:43.356800 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.356693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:43.356800 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.356693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 13:59:43.357016 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:43.356829 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac" Apr 16 13:59:43.357016 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:43.356904 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f" Apr 16 13:59:43.357016 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.356955 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:43.357157 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:43.357032 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb" Apr 16 13:59:43.425991 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.425955 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" event={"ID":"591845d7-ab61-4e19-8a4c-e0c14a2f6c24","Type":"ContainerStarted","Data":"3174e44d494e5d0b157ddc5b5d15971dc76a228e4a3ca75d6fafcc7ffe726051"} Apr 16 13:59:43.427970 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.427933 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal" event={"ID":"9a325dd6c6725307a6ffbdfee2361b7c","Type":"ContainerStarted","Data":"c2eba064536f2fde7c3a95a51dfe5a0f3d3263e3f759955c9e413355e8bbc91b"} Apr 16 13:59:43.431123 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.431093 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 13:59:43.431518 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.431493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"d87c00b837348392f7077e0f7031c92a71900ff1686f5b63f80565615477d9f1"} Apr 16 13:59:43.443236 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.443191 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nzjqq" podStartSLOduration=2.873625193 podStartE2EDuration="22.443175196s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.561263929 +0000 UTC m=+2.785935148" lastFinishedPulling="2026-04-16 13:59:43.130813928 +0000 UTC m=+22.355485151" observedRunningTime="2026-04-16 13:59:43.44255499 +0000 UTC m=+22.667226226" 
watchObservedRunningTime="2026-04-16 13:59:43.443175196 +0000 UTC m=+22.667846435" Apr 16 13:59:43.455449 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:43.455407 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-109.ec2.internal" podStartSLOduration=21.455390415 podStartE2EDuration="21.455390415s" podCreationTimestamp="2026-04-16 13:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:43.455176264 +0000 UTC m=+22.679847499" watchObservedRunningTime="2026-04-16 13:59:43.455390415 +0000 UTC m=+22.680061649" Apr 16 13:59:45.356370 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:45.356340 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:45.356807 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:45.356373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:45.356807 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:45.356340 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 13:59:45.356807 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:45.356479 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb" Apr 16 13:59:45.356807 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:45.356571 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f" Apr 16 13:59:45.356807 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:45.356657 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac" Apr 16 13:59:46.442066 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.441895 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 13:59:46.442586 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.442389 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"f5695d5e34711c64fd3981f44ceeb8d35e0633f8381fd150108d7834d5f72d21"} Apr 16 13:59:46.442801 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.442746 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:46.442801 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.442774 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:46.442962 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.442846 2570 scope.go:117] "RemoveContainer" containerID="d1149f8985c1f5969fae648d939b002c6799ed680d06a41219e643512f3cefd2" Apr 16 13:59:46.461429 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.461368 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:46.461540 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.461434 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 13:59:46.911080 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.911048 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:46.911826 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:46.911808 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-x8zzl" Apr 16 13:59:47.356263 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.356233 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 13:59:47.356432 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.356233 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 13:59:47.356432 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:47.356347 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:47.356432 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.356361 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:47.356612 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:47.356429 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:47.356612 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:47.356518 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:47.447167 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.447144 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log"
Apr 16 13:59:47.447846 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.447489 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" event={"ID":"8ad82b8c-5f9d-40e3-bf04-ee7dff525d90","Type":"ContainerStarted","Data":"68a2710d1ae3574b368c0515db3d7b9cfc68dbf8c04046737a2b495762395583"}
Apr 16 13:59:47.447846 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.447587 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 13:59:47.449044 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.449023 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbe0c2f-4375-424a-a6f9-acf5ed5f216c" containerID="6bf90e5a7cd96e1be16a9014b2afde0e84821ac52b937d884a34cad557f6ea54" exitCode=0
Apr 16 13:59:47.449140 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.449092 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerDied","Data":"6bf90e5a7cd96e1be16a9014b2afde0e84821ac52b937d884a34cad557f6ea54"}
Apr 16 13:59:47.449281 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.449264 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-x8zzl"
Apr 16 13:59:47.450031 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.449726 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-x8zzl"
Apr 16 13:59:47.477504 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.477444 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" podStartSLOduration=9.538483138 podStartE2EDuration="26.477425077s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.584774945 +0000 UTC m=+2.809446163" lastFinishedPulling="2026-04-16 13:59:40.523716883 +0000 UTC m=+19.748388102" observedRunningTime="2026-04-16 13:59:47.475622773 +0000 UTC m=+26.700294009" watchObservedRunningTime="2026-04-16 13:59:47.477425077 +0000 UTC m=+26.702096316"
Apr 16 13:59:47.934950 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:47.934913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:47.935098 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:47.935053 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:47.935136 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:47.935119 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:03.935103491 +0000 UTC m=+43.159774709 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:48.104527 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.104303 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w"
Apr 16 13:59:48.489325 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.488791 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m5nn8"]
Apr 16 13:59:48.489325 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.488925 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:48.489325 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:48.489017 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:48.489909 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.489722 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sxl7w"]
Apr 16 13:59:48.489909 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.489834 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:48.490000 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:48.489930 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:48.490363 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.490338 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2dz2d"]
Apr 16 13:59:48.490474 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:48.490456 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:48.490583 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:48.490564 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:49.455251 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:49.455223 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbe0c2f-4375-424a-a6f9-acf5ed5f216c" containerID="cfad80404732dc073afc15b81463c7399e26cc2dc931919ca48a69fd415db317" exitCode=0
Apr 16 13:59:49.455445 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:49.455323 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerDied","Data":"cfad80404732dc073afc15b81463c7399e26cc2dc931919ca48a69fd415db317"}
Apr 16 13:59:50.356400 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:50.356367 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:50.356400 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:50.356388 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:50.357006 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:50.356372 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:50.357006 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:50.356497 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:50.357006 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:50.356565 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:50.357006 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:50.356654 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:51.461683 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:51.461647 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbe0c2f-4375-424a-a6f9-acf5ed5f216c" containerID="4fbcbc3815ce070e84a575ae4e528cc587c79bc81cabd6b8873a976cfa8bcd10" exitCode=0
Apr 16 13:59:51.461683 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:51.461686 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerDied","Data":"4fbcbc3815ce070e84a575ae4e528cc587c79bc81cabd6b8873a976cfa8bcd10"}
Apr 16 13:59:52.000982 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:52.000953 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvl68_d62a9568-15dd-4b2a-b879-e1ae35037432/dns-node-resolver/0.log"
Apr 16 13:59:52.357128 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:52.357043 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:52.357128 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:52.357069 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:52.357334 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:52.357152 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:52.357334 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:52.357170 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:52.357334 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:52.357249 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:52.357473 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:52.357340 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:52.983985 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:52.983953 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q4pbl_3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d/node-ca/0.log"
Apr 16 13:59:54.356345 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:54.356294 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:54.356777 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:54.356301 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:54.356777 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.356446 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:54.356777 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.356473 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:54.356777 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:54.356330 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:54.356777 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.356542 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:54.985891 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:54.985856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:54.986095 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:54.985909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:54.986095 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.986029 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:54.986095 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.986040 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:54.986095 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.986063 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:54.986095 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.986077 2570 projected.go:194] Error preparing data for projected volume kube-api-access-jq4jh for pod openshift-network-diagnostics/network-check-target-m5nn8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:54.986095 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.986092 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs podName:f570a9dc-9480-415b-9633-11fb3c3a05eb nodeName:}" failed. No retries permitted until 2026-04-16 14:00:26.986073557 +0000 UTC m=+66.210744776 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs") pod "network-metrics-daemon-2dz2d" (UID: "f570a9dc-9480-415b-9633-11fb3c3a05eb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:54.986359 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:54.986124 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh podName:1522dd59-b1b0-4b61-8eed-6b2da396ebac nodeName:}" failed. No retries permitted until 2026-04-16 14:00:26.986112301 +0000 UTC m=+66.210783519 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jq4jh" (UniqueName: "kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh") pod "network-check-target-m5nn8" (UID: "1522dd59-b1b0-4b61-8eed-6b2da396ebac") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:56.356691 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:56.356654 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:56.357186 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:56.356658 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:56.357186 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:56.356794 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:56.357186 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:56.356815 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:56.357186 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:56.356904 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:56.357186 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:56.356992 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:57.479127 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:57.478944 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbe0c2f-4375-424a-a6f9-acf5ed5f216c" containerID="f971c196c49a299c34476f70dcd952ae2e3c2fec0608b2670b6de560825cc23d" exitCode=0
Apr 16 13:59:57.479127 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:57.479018 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerDied","Data":"f971c196c49a299c34476f70dcd952ae2e3c2fec0608b2670b6de560825cc23d"}
Apr 16 13:59:58.357087 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:58.357057 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 13:59:58.357268 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:58.357095 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 13:59:58.357268 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:58.357176 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 13:59:58.357268 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:58.357243 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 13:59:58.357425 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:58.357357 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 13:59:58.357425 ip-10-0-136-109 kubenswrapper[2570]: E0416 13:59:58.357411 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 13:59:58.483815 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:58.483781 2570 generic.go:358] "Generic (PLEG): container finished" podID="1cbe0c2f-4375-424a-a6f9-acf5ed5f216c" containerID="56f250b1dc350e27d74fb70b76b812c36f02ab593ba6808707809c59b6d607f5" exitCode=0
Apr 16 13:59:58.484173 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:58.483849 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerDied","Data":"56f250b1dc350e27d74fb70b76b812c36f02ab593ba6808707809c59b6d607f5"}
Apr 16 13:59:59.488845 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:59.488813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" event={"ID":"1cbe0c2f-4375-424a-a6f9-acf5ed5f216c","Type":"ContainerStarted","Data":"562966b6038c8d66aac60efa2b9a126c29dc20c6556e5a92467a230b46bbb1ac"}
Apr 16 13:59:59.510461 ip-10-0-136-109 kubenswrapper[2570]: I0416 13:59:59.510418 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mvgp6" podStartSLOduration=4.884222724 podStartE2EDuration="38.510405843s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 13:59:23.543808042 +0000 UTC m=+2.768479258" lastFinishedPulling="2026-04-16 13:59:57.169991149 +0000 UTC m=+36.394662377" observedRunningTime="2026-04-16 13:59:59.50988508 +0000 UTC m=+38.734556312" watchObservedRunningTime="2026-04-16 13:59:59.510405843 +0000 UTC m=+38.735077077"
Apr 16 14:00:00.356403 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:00.356374 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:00.356576 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:00.356419 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:00.356576 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:00.356426 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:00.356576 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:00.356499 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 14:00:00.356576 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:00.356553 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 14:00:00.356724 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:00.356626 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 14:00:02.356866 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:02.356833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:02.357233 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:02.356833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:02.357233 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:02.356833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:02.357233 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:02.357029 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 14:00:02.357233 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:02.356934 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 14:00:02.357233 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:02.357113 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 14:00:03.951300 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:03.951246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:03.951692 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:03.951438 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:00:03.951692 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:03.951504 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret podName:c3956767-9c3a-4525-b3d3-d3e177d9479f nodeName:}" failed. No retries permitted until 2026-04-16 14:00:35.95148874 +0000 UTC m=+75.176159958 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret") pod "global-pull-secret-syncer-sxl7w" (UID: "c3956767-9c3a-4525-b3d3-d3e177d9479f") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:00:04.356626 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:04.356536 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:04.356766 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:04.356537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:04.356766 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:04.356647 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 14:00:04.356766 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:04.356537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:04.356864 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:04.356777 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 14:00:04.356864 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:04.356717 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 14:00:06.356037 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:06.356003 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:06.356526 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:06.356003 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:06.356526 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:06.356102 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 14:00:06.356526 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:06.356014 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:06.356526 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:06.356169 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 14:00:06.356526 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:06.356270 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 14:00:08.356977 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:08.356938 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:08.357402 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:08.356987 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:08.357402 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:08.356936 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:08.357402 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:08.357054 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 14:00:08.357402 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:08.357125 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 14:00:08.357402 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:08.357194 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 14:00:10.356771 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:10.356741 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:10.357212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:10.356741 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w"
Apr 16 14:00:10.357212 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:10.356851 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb"
Apr 16 14:00:10.357212 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:10.356902 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f"
Apr 16 14:00:10.357212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:10.356740 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:10.357212 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:10.356975 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac"
Apr 16 14:00:12.356646 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.356602 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 14:00:12.357050 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.356602 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 14:00:12.357050 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:12.356708 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5nn8" podUID="1522dd59-b1b0-4b61-8eed-6b2da396ebac" Apr 16 14:00:12.357050 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.356621 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 14:00:12.357050 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:12.356807 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2dz2d" podUID="f570a9dc-9480-415b-9633-11fb3c3a05eb" Apr 16 14:00:12.357050 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:12.356867 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-sxl7w" podUID="c3956767-9c3a-4525-b3d3-d3e177d9479f" Apr 16 14:00:12.628073 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.627992 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeReady" Apr 16 14:00:12.628239 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.628119 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:00:12.642005 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.641957 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-109.ec2.internal" event="NodeReady" Apr 16 14:00:12.665489 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.665459 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-59d866bf84-bkpt6"] Apr 16 14:00:12.687597 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.687570 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g94vr"] Apr 16 14:00:12.687760 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.687743 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.690194 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.690139 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:00:12.690359 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.690292 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hnvqt\"" Apr 16 14:00:12.690444 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.690382 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:00:12.690685 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.690663 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:00:12.696176 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.696157 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:00:12.706981 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.706961 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2z9p5"] Apr 16 14:00:12.707137 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.707122 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.709092 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709045 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-installation-pull-secrets\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709180 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709088 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6sg\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-kube-api-access-dt6sg\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709180 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-bound-sa-token\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709180 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-registry-tls\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709346 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709225 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-trusted-ca\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709346 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-image-registry-private-configuration\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709346 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-ca-trust-extracted\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709468 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709365 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-registry-certificates\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.709685 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.709668 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:00:12.710166 ip-10-0-136-109 kubenswrapper[2570]: I0416 
14:00:12.710149 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-pfkw7\"" Apr 16 14:00:12.710240 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.710218 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:00:12.735283 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.735257 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59d866bf84-bkpt6"] Apr 16 14:00:12.735283 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.735284 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g94vr"] Apr 16 14:00:12.735460 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.735294 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2z9p5"] Apr 16 14:00:12.735460 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.735305 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lmzjv"] Apr 16 14:00:12.735460 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.735413 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:12.738163 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.738139 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:00:12.738295 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.738275 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gjmb2\"" Apr 16 14:00:12.739893 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.739878 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:00:12.742082 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.742066 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:00:12.756155 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.756135 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lmzjv"] Apr 16 14:00:12.756276 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.756200 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.759539 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.759520 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:00:12.759748 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.759737 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:12.767371 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.767352 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:12.767550 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.767538 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:00:12.768144 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.768127 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h5f2n\"" Apr 16 14:00:12.810500 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-bound-sa-token\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.810639 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810503 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ade50be-58d9-4908-9196-52d293c0182d-tmp-dir\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 
14:00:12.810639 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810522 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s95z\" (UniqueName: \"kubernetes.io/projected/d9aec4e9-fdb6-48ec-9e84-bfcd44149787-kube-api-access-6s95z\") pod \"ingress-canary-2z9p5\" (UID: \"d9aec4e9-fdb6-48ec-9e84-bfcd44149787\") " pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:12.810639 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810539 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/472bbe6c-6852-49ae-b248-56beee337ffa-data-volume\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.810639 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810556 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-registry-tls\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.810639 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22tj\" (UniqueName: \"kubernetes.io/projected/4ade50be-58d9-4908-9196-52d293c0182d-kube-api-access-t22tj\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.810851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810723 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9aec4e9-fdb6-48ec-9e84-bfcd44149787-cert\") pod 
\"ingress-canary-2z9p5\" (UID: \"d9aec4e9-fdb6-48ec-9e84-bfcd44149787\") " pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:12.810851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810831 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ade50be-58d9-4908-9196-52d293c0182d-metrics-tls\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.810923 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810898 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-trusted-ca\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.810961 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-image-registry-private-configuration\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.810961 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810957 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-ca-trust-extracted\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.811059 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.810978 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-registry-certificates\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.811059 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811007 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-installation-pull-secrets\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.811059 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811030 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/472bbe6c-6852-49ae-b248-56beee337ffa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.811059 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811056 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/472bbe6c-6852-49ae-b248-56beee337ffa-crio-socket\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.811248 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgdt\" (UniqueName: \"kubernetes.io/projected/472bbe6c-6852-49ae-b248-56beee337ffa-kube-api-access-nlgdt\") pod \"insights-runtime-extractor-lmzjv\" (UID: 
\"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.811248 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811128 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6sg\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-kube-api-access-dt6sg\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.811248 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811191 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ade50be-58d9-4908-9196-52d293c0182d-config-volume\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.811248 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/472bbe6c-6852-49ae-b248-56beee337ffa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.811478 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811365 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-ca-trust-extracted\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.811836 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-registry-certificates\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.811976 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.811958 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-trusted-ca\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.814782 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.814750 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-image-registry-private-configuration\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.814874 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.814758 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-registry-tls\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.814874 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.814822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-installation-pull-secrets\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " 
pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.819738 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.819720 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-bound-sa-token\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.821129 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.821109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6sg\" (UniqueName: \"kubernetes.io/projected/11d21f1f-6ef5-4db0-9edf-20ae92adb2a7-kube-api-access-dt6sg\") pod \"image-registry-59d866bf84-bkpt6\" (UID: \"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7\") " pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:12.912281 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/472bbe6c-6852-49ae-b248-56beee337ffa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912294 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/472bbe6c-6852-49ae-b248-56beee337ffa-crio-socket\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912338 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgdt\" (UniqueName: 
\"kubernetes.io/projected/472bbe6c-6852-49ae-b248-56beee337ffa-kube-api-access-nlgdt\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912365 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ade50be-58d9-4908-9196-52d293c0182d-config-volume\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/472bbe6c-6852-49ae-b248-56beee337ffa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/472bbe6c-6852-49ae-b248-56beee337ffa-crio-socket\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ade50be-58d9-4908-9196-52d293c0182d-tmp-dir\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.912505 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912460 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6s95z\" (UniqueName: \"kubernetes.io/projected/d9aec4e9-fdb6-48ec-9e84-bfcd44149787-kube-api-access-6s95z\") pod \"ingress-canary-2z9p5\" (UID: \"d9aec4e9-fdb6-48ec-9e84-bfcd44149787\") " pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:12.912851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/472bbe6c-6852-49ae-b248-56beee337ffa-data-volume\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.912851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t22tj\" (UniqueName: \"kubernetes.io/projected/4ade50be-58d9-4908-9196-52d293c0182d-kube-api-access-t22tj\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.912851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912577 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9aec4e9-fdb6-48ec-9e84-bfcd44149787-cert\") pod \"ingress-canary-2z9p5\" (UID: \"d9aec4e9-fdb6-48ec-9e84-bfcd44149787\") " pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:12.912851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ade50be-58d9-4908-9196-52d293c0182d-metrics-tls\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.913050 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.912868 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/472bbe6c-6852-49ae-b248-56beee337ffa-data-volume\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.914903 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.914881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/472bbe6c-6852-49ae-b248-56beee337ffa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.914995 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.914922 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9aec4e9-fdb6-48ec-9e84-bfcd44149787-cert\") pod \"ingress-canary-2z9p5\" (UID: \"d9aec4e9-fdb6-48ec-9e84-bfcd44149787\") " pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:12.920109 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.920079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgdt\" (UniqueName: \"kubernetes.io/projected/472bbe6c-6852-49ae-b248-56beee337ffa-kube-api-access-nlgdt\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.920564 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.920547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s95z\" (UniqueName: \"kubernetes.io/projected/d9aec4e9-fdb6-48ec-9e84-bfcd44149787-kube-api-access-6s95z\") pod \"ingress-canary-2z9p5\" (UID: \"d9aec4e9-fdb6-48ec-9e84-bfcd44149787\") " pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 
14:00:12.925348 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.925309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/472bbe6c-6852-49ae-b248-56beee337ffa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lmzjv\" (UID: \"472bbe6c-6852-49ae-b248-56beee337ffa\") " pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:12.925432 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.925392 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4ade50be-58d9-4908-9196-52d293c0182d-tmp-dir\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.925571 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.925548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ade50be-58d9-4908-9196-52d293c0182d-config-volume\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.927097 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.927082 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ade50be-58d9-4908-9196-52d293c0182d-metrics-tls\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.927152 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:12.927108 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22tj\" (UniqueName: \"kubernetes.io/projected/4ade50be-58d9-4908-9196-52d293c0182d-kube-api-access-t22tj\") pod \"dns-default-g94vr\" (UID: \"4ade50be-58d9-4908-9196-52d293c0182d\") " pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:12.998685 ip-10-0-136-109 kubenswrapper[2570]: 
I0416 14:00:12.998649 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:13.015648 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.015620 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:13.044374 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.044345 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2z9p5" Apr 16 14:00:13.064254 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.064222 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lmzjv" Apr 16 14:00:13.208148 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.208093 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g94vr"] Apr 16 14:00:13.210530 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.210506 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59d866bf84-bkpt6"] Apr 16 14:00:13.218925 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.218902 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2z9p5"] Apr 16 14:00:13.219699 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:13.219660 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ade50be_58d9_4908_9196_52d293c0182d.slice/crio-44598ef7dfb6701276b9114014524d456bc936191e308bfece298cace335f2f3 WatchSource:0}: Error finding container 44598ef7dfb6701276b9114014524d456bc936191e308bfece298cace335f2f3: Status 404 returned error can't find the container with id 44598ef7dfb6701276b9114014524d456bc936191e308bfece298cace335f2f3 Apr 16 14:00:13.220376 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:13.220354 2570 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d21f1f_6ef5_4db0_9edf_20ae92adb2a7.slice/crio-69295a862231ae2e8ac925fb5d7c61fc97a9a087aed9b28f6d2a1fcbddf1888a WatchSource:0}: Error finding container 69295a862231ae2e8ac925fb5d7c61fc97a9a087aed9b28f6d2a1fcbddf1888a: Status 404 returned error can't find the container with id 69295a862231ae2e8ac925fb5d7c61fc97a9a087aed9b28f6d2a1fcbddf1888a Apr 16 14:00:13.222854 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:13.222780 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9aec4e9_fdb6_48ec_9e84_bfcd44149787.slice/crio-e8b0444d70abcf9c03f7e76ae104388bcb1b86c3590c334834fa1d4410898f5a WatchSource:0}: Error finding container e8b0444d70abcf9c03f7e76ae104388bcb1b86c3590c334834fa1d4410898f5a: Status 404 returned error can't find the container with id e8b0444d70abcf9c03f7e76ae104388bcb1b86c3590c334834fa1d4410898f5a Apr 16 14:00:13.235961 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.235914 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lmzjv"] Apr 16 14:00:13.247121 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:13.247095 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod472bbe6c_6852_49ae_b248_56beee337ffa.slice/crio-8b9b47b2c5b792ac0fdd7e68b733ebe9303fdcf4557154fad9b6a72fbb9d3a8a WatchSource:0}: Error finding container 8b9b47b2c5b792ac0fdd7e68b733ebe9303fdcf4557154fad9b6a72fbb9d3a8a: Status 404 returned error can't find the container with id 8b9b47b2c5b792ac0fdd7e68b733ebe9303fdcf4557154fad9b6a72fbb9d3a8a Apr 16 14:00:13.513736 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.513648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmzjv" 
event={"ID":"472bbe6c-6852-49ae-b248-56beee337ffa","Type":"ContainerStarted","Data":"ee1874d17778295c34d351f424a662172edb67d43f95024944aa5d80d5a05d6a"} Apr 16 14:00:13.513736 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.513682 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmzjv" event={"ID":"472bbe6c-6852-49ae-b248-56beee337ffa","Type":"ContainerStarted","Data":"8b9b47b2c5b792ac0fdd7e68b733ebe9303fdcf4557154fad9b6a72fbb9d3a8a"} Apr 16 14:00:13.514634 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.514605 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2z9p5" event={"ID":"d9aec4e9-fdb6-48ec-9e84-bfcd44149787","Type":"ContainerStarted","Data":"e8b0444d70abcf9c03f7e76ae104388bcb1b86c3590c334834fa1d4410898f5a"} Apr 16 14:00:13.515629 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.515609 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g94vr" event={"ID":"4ade50be-58d9-4908-9196-52d293c0182d","Type":"ContainerStarted","Data":"44598ef7dfb6701276b9114014524d456bc936191e308bfece298cace335f2f3"} Apr 16 14:00:13.516795 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.516776 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" event={"ID":"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7","Type":"ContainerStarted","Data":"02caca04d12c1e1b9b20f93a74fb63fd4c630ef54243902d5b14e611e1d1e701"} Apr 16 14:00:13.516795 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.516798 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" event={"ID":"11d21f1f-6ef5-4db0-9edf-20ae92adb2a7","Type":"ContainerStarted","Data":"69295a862231ae2e8ac925fb5d7c61fc97a9a087aed9b28f6d2a1fcbddf1888a"} Apr 16 14:00:13.516942 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.516923 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:13.540553 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:13.540498 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" podStartSLOduration=2.5404726159999997 podStartE2EDuration="2.540472616s" podCreationTimestamp="2026-04-16 14:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:13.539651087 +0000 UTC m=+52.764322323" watchObservedRunningTime="2026-04-16 14:00:13.540472616 +0000 UTC m=+52.765143852" Apr 16 14:00:14.024051 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.024021 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98"] Apr 16 14:00:14.054552 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.054523 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98"] Apr 16 14:00:14.054686 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.054651 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:14.056936 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.056913 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 14:00:14.057041 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.056952 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-brd56\"" Apr 16 14:00:14.122811 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.122777 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/98f256c6-561e-44bb-8dcb-e35ac6f8bab0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kqz98\" (UID: \"98f256c6-561e-44bb-8dcb-e35ac6f8bab0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:14.223381 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.223337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/98f256c6-561e-44bb-8dcb-e35ac6f8bab0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kqz98\" (UID: \"98f256c6-561e-44bb-8dcb-e35ac6f8bab0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:14.223518 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:14.223416 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 14:00:14.223518 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:14.223484 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/98f256c6-561e-44bb-8dcb-e35ac6f8bab0-tls-certificates podName:98f256c6-561e-44bb-8dcb-e35ac6f8bab0 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:14.723467881 +0000 UTC m=+53.948139094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/98f256c6-561e-44bb-8dcb-e35ac6f8bab0-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-kqz98" (UID: "98f256c6-561e-44bb-8dcb-e35ac6f8bab0") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 14:00:14.356152 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.356060 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 14:00:14.356307 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.356062 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 14:00:14.356307 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.356226 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d" Apr 16 14:00:14.358630 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.358599 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:14.358741 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.358639 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:00:14.358741 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.358701 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcmqf\"" Apr 16 14:00:14.358853 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.358757 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:14.358905 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.358853 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:14.358905 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.358871 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-snjtl\"" Apr 16 14:00:14.728986 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.728944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/98f256c6-561e-44bb-8dcb-e35ac6f8bab0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kqz98\" (UID: \"98f256c6-561e-44bb-8dcb-e35ac6f8bab0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:14.731680 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.731650 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/98f256c6-561e-44bb-8dcb-e35ac6f8bab0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-kqz98\" (UID: \"98f256c6-561e-44bb-8dcb-e35ac6f8bab0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:14.963543 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:14.963494 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:15.732937 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:15.732882 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98"] Apr 16 14:00:15.829965 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:15.829921 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f256c6_561e_44bb_8dcb_e35ac6f8bab0.slice/crio-cb6f22aa2a1c6d4ce8678b8630d35e47d874d33c515c10c047fffa104d5be48d WatchSource:0}: Error finding container cb6f22aa2a1c6d4ce8678b8630d35e47d874d33c515c10c047fffa104d5be48d: Status 404 returned error can't find the container with id cb6f22aa2a1c6d4ce8678b8630d35e47d874d33c515c10c047fffa104d5be48d Apr 16 14:00:16.525554 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:16.525522 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2z9p5" event={"ID":"d9aec4e9-fdb6-48ec-9e84-bfcd44149787","Type":"ContainerStarted","Data":"2f7498a8f01f2e03a5cdbbcbe12173fccc4d24db9f2172f7aa6d8feb20f70986"} Apr 16 14:00:16.526513 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:16.526485 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" 
event={"ID":"98f256c6-561e-44bb-8dcb-e35ac6f8bab0","Type":"ContainerStarted","Data":"cb6f22aa2a1c6d4ce8678b8630d35e47d874d33c515c10c047fffa104d5be48d"} Apr 16 14:00:17.531454 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:17.531423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmzjv" event={"ID":"472bbe6c-6852-49ae-b248-56beee337ffa","Type":"ContainerStarted","Data":"fd2c35e308b92c54932b77e805d101914143e6719ffd7da4d775dd380baf49b7"} Apr 16 14:00:17.532877 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:17.532857 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g94vr" event={"ID":"4ade50be-58d9-4908-9196-52d293c0182d","Type":"ContainerStarted","Data":"625ff9d2c0886ff4ef5920d272f9c7f2a5b1cad7ecc07c4124f5b278e6d795f7"} Apr 16 14:00:17.532877 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:17.532878 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g94vr" event={"ID":"4ade50be-58d9-4908-9196-52d293c0182d","Type":"ContainerStarted","Data":"c25d95553bb44f9d8f3679c4b7c3a43ea538801a2cab3d6f4f7ff381632770e9"} Apr 16 14:00:17.548180 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:17.548128 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2z9p5" podStartSLOduration=2.352165072 podStartE2EDuration="5.548114388s" podCreationTimestamp="2026-04-16 14:00:12 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.22430747 +0000 UTC m=+52.448978684" lastFinishedPulling="2026-04-16 14:00:16.420256784 +0000 UTC m=+55.644928000" observedRunningTime="2026-04-16 14:00:17.547342584 +0000 UTC m=+56.772013819" watchObservedRunningTime="2026-04-16 14:00:17.548114388 +0000 UTC m=+56.772785623" Apr 16 14:00:17.564512 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:17.564460 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g94vr" 
podStartSLOduration=2.155108744 podStartE2EDuration="5.564444343s" podCreationTimestamp="2026-04-16 14:00:12 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.221969495 +0000 UTC m=+52.446640709" lastFinishedPulling="2026-04-16 14:00:16.631305081 +0000 UTC m=+55.855976308" observedRunningTime="2026-04-16 14:00:17.56412446 +0000 UTC m=+56.788795721" watchObservedRunningTime="2026-04-16 14:00:17.564444343 +0000 UTC m=+56.789115617" Apr 16 14:00:18.535896 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:18.535851 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g94vr" Apr 16 14:00:19.471862 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:19.471835 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fkt9w" Apr 16 14:00:19.540023 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:19.539986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lmzjv" event={"ID":"472bbe6c-6852-49ae-b248-56beee337ffa","Type":"ContainerStarted","Data":"9878593db406ff13c35971b57be9191de6dc1734b79b13c94689bc8f97024bbf"} Apr 16 14:00:19.541449 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:19.541421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" event={"ID":"98f256c6-561e-44bb-8dcb-e35ac6f8bab0","Type":"ContainerStarted","Data":"a9700c52ad709272a7656c554c87e54c47209e16fc45f7c230613bb472644d3e"} Apr 16 14:00:19.564474 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:19.564424 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lmzjv" podStartSLOduration=1.9073487820000001 podStartE2EDuration="7.564411203s" podCreationTimestamp="2026-04-16 14:00:12 +0000 UTC" firstStartedPulling="2026-04-16 14:00:13.331002479 +0000 UTC m=+52.555673692" 
lastFinishedPulling="2026-04-16 14:00:18.9880649 +0000 UTC m=+58.212736113" observedRunningTime="2026-04-16 14:00:19.563654116 +0000 UTC m=+58.788325352" watchObservedRunningTime="2026-04-16 14:00:19.564411203 +0000 UTC m=+58.789082438" Apr 16 14:00:19.580332 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:19.580273 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" podStartSLOduration=2.633524905 podStartE2EDuration="5.580259443s" podCreationTimestamp="2026-04-16 14:00:14 +0000 UTC" firstStartedPulling="2026-04-16 14:00:16.038051385 +0000 UTC m=+55.262722603" lastFinishedPulling="2026-04-16 14:00:18.984785925 +0000 UTC m=+58.209457141" observedRunningTime="2026-04-16 14:00:19.579890247 +0000 UTC m=+58.804561487" watchObservedRunningTime="2026-04-16 14:00:19.580259443 +0000 UTC m=+58.804930710" Apr 16 14:00:20.544560 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:20.544526 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:20.552476 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:20.550744 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-kqz98" Apr 16 14:00:21.084299 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.084268 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xgrnx"] Apr 16 14:00:21.088706 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.088680 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.091043 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.091023 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:00:21.092969 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.092941 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:00:21.093073 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.093009 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-857mf\"" Apr 16 14:00:21.093073 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.093033 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:00:21.093250 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.093234 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:00:21.093337 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.093242 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:00:21.097027 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.097006 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xgrnx"] Apr 16 14:00:21.179156 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.179093 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4348b7d4-6755-4435-9405-2298a8d123bc-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xgrnx\" 
(UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.179376 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.179202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r665z\" (UniqueName: \"kubernetes.io/projected/4348b7d4-6755-4435-9405-2298a8d123bc-kube-api-access-r665z\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.179376 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.179244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4348b7d4-6755-4435-9405-2298a8d123bc-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.179376 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.179340 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4348b7d4-6755-4435-9405-2298a8d123bc-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.280516 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.280457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4348b7d4-6755-4435-9405-2298a8d123bc-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.280714 ip-10-0-136-109 
kubenswrapper[2570]: I0416 14:00:21.280528 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4348b7d4-6755-4435-9405-2298a8d123bc-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.280714 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.280579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r665z\" (UniqueName: \"kubernetes.io/projected/4348b7d4-6755-4435-9405-2298a8d123bc-kube-api-access-r665z\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.280714 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.280606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4348b7d4-6755-4435-9405-2298a8d123bc-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" Apr 16 14:00:21.283385 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.283366 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:00:21.283473 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.283457 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:00:21.283512 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.283462 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" 
Apr 16 14:00:21.290045 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.290027 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:00:21.291691 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.291672 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4348b7d4-6755-4435-9405-2298a8d123bc-metrics-client-ca\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx"
Apr 16 14:00:21.293212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.293192 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4348b7d4-6755-4435-9405-2298a8d123bc-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx"
Apr 16 14:00:21.293350 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.293333 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4348b7d4-6755-4435-9405-2298a8d123bc-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx"
Apr 16 14:00:21.300597 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.300578 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:00:21.310574 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.310554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r665z\" (UniqueName: \"kubernetes.io/projected/4348b7d4-6755-4435-9405-2298a8d123bc-kube-api-access-r665z\") pod \"prometheus-operator-78f957474d-xgrnx\" (UID: \"4348b7d4-6755-4435-9405-2298a8d123bc\") " pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx"
Apr 16 14:00:21.402305 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.402229 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-857mf\""
Apr 16 14:00:21.410040 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.410020 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx"
Apr 16 14:00:21.536604 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.536562 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-xgrnx"]
Apr 16 14:00:21.540179 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:21.540151 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4348b7d4_6755_4435_9405_2298a8d123bc.slice/crio-d80cc7f88cd007d9d4daef536d498aa7550f17d8ed4c5a75fc3b8afa8dad8923 WatchSource:0}: Error finding container d80cc7f88cd007d9d4daef536d498aa7550f17d8ed4c5a75fc3b8afa8dad8923: Status 404 returned error can't find the container with id d80cc7f88cd007d9d4daef536d498aa7550f17d8ed4c5a75fc3b8afa8dad8923
Apr 16 14:00:21.548024 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:21.547974 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" event={"ID":"4348b7d4-6755-4435-9405-2298a8d123bc","Type":"ContainerStarted","Data":"d80cc7f88cd007d9d4daef536d498aa7550f17d8ed4c5a75fc3b8afa8dad8923"}
Apr 16 14:00:24.557136 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:24.557043 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" event={"ID":"4348b7d4-6755-4435-9405-2298a8d123bc","Type":"ContainerStarted","Data":"3bcaaf33b67ab73e03f056043f7b433325dc7176243739e2def9419821355841"}
Apr 16 14:00:24.557136 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:24.557086 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" event={"ID":"4348b7d4-6755-4435-9405-2298a8d123bc","Type":"ContainerStarted","Data":"e857b4e4430501694d672341a95e06074e63d8bd2277d0b7f52c1495abbeef19"}
Apr 16 14:00:24.573130 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:24.573078 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-xgrnx" podStartSLOduration=0.948024996 podStartE2EDuration="3.573064807s" podCreationTimestamp="2026-04-16 14:00:21 +0000 UTC" firstStartedPulling="2026-04-16 14:00:21.541984636 +0000 UTC m=+60.766655849" lastFinishedPulling="2026-04-16 14:00:24.167024446 +0000 UTC m=+63.391695660" observedRunningTime="2026-04-16 14:00:24.572547707 +0000 UTC m=+63.797218942" watchObservedRunningTime="2026-04-16 14:00:24.573064807 +0000 UTC m=+63.797736041"
Apr 16 14:00:26.476818 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.476783 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gc5pn"]
Apr 16 14:00:26.499182 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.499152 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.501674 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.501634 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:00:26.501674 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.501635 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:00:26.501978 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.501963 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:00:26.502044 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.501980 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nmjsm\""
Apr 16 14:00:26.619886 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.619848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620077 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.619906 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-tls\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620077 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.619947 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-root\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620077 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.619977 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-wtmp\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620077 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.620011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-textfile\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620234 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.620077 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-accelerators-collector-config\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620234 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.620102 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-sys\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620234 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.620119 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgqxs\" (UniqueName: \"kubernetes.io/projected/bace3793-d54a-47c6-a30b-d75e319d7753-kube-api-access-wgqxs\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.620234 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.620147 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bace3793-d54a-47c6-a30b-d75e319d7753-metrics-client-ca\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720750 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720707 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-tls\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720816 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-root\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-wtmp\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-textfile\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720910 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-root\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-accelerators-collector-config\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.720958 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:26.720933 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.720989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-sys\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:26.721004 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-tls podName:bace3793-d54a-47c6-a30b-d75e319d7753 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:27.220983248 +0000 UTC m=+66.445654464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-tls") pod "node-exporter-gc5pn" (UID: "bace3793-d54a-47c6-a30b-d75e319d7753") : secret "node-exporter-tls" not found
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721026 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-sys\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgqxs\" (UniqueName: \"kubernetes.io/projected/bace3793-d54a-47c6-a30b-d75e319d7753-kube-api-access-wgqxs\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721065 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-wtmp\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bace3793-d54a-47c6-a30b-d75e319d7753-metrics-client-ca\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721369 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721255 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-textfile\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721672 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-accelerators-collector-config\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.721755 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.721738 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bace3793-d54a-47c6-a30b-d75e319d7753-metrics-client-ca\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.723110 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.723091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:26.729469 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:26.729407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgqxs\" (UniqueName: \"kubernetes.io/projected/bace3793-d54a-47c6-a30b-d75e319d7753-kube-api-access-wgqxs\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:27.024160 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.024070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:27.024160 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.024124 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:27.026608 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.026589 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:00:27.026649 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.026590 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:27.036565 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.036542 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:00:27.036783 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.036767 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f570a9dc-9480-415b-9633-11fb3c3a05eb-metrics-certs\") pod \"network-metrics-daemon-2dz2d\" (UID: \"f570a9dc-9480-415b-9633-11fb3c3a05eb\") " pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:27.065036 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.065011 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq4jh\" (UniqueName: \"kubernetes.io/projected/1522dd59-b1b0-4b61-8eed-6b2da396ebac-kube-api-access-jq4jh\") pod \"network-check-target-m5nn8\" (UID: \"1522dd59-b1b0-4b61-8eed-6b2da396ebac\") " pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:27.225444 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.225406 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-tls\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:27.227720 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.227695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bace3793-d54a-47c6-a30b-d75e319d7753-node-exporter-tls\") pod \"node-exporter-gc5pn\" (UID: \"bace3793-d54a-47c6-a30b-d75e319d7753\") " pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:27.276656 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.276569 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lcmqf\""
Apr 16 14:00:27.282487 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.282466 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-snjtl\""
Apr 16 14:00:27.285273 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.285259 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5nn8"
Apr 16 14:00:27.290982 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.290960 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2dz2d"
Apr 16 14:00:27.408479 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.408452 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gc5pn"
Apr 16 14:00:27.410554 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.410446 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m5nn8"]
Apr 16 14:00:27.414395 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:27.414366 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1522dd59_b1b0_4b61_8eed_6b2da396ebac.slice/crio-9de2c168a9c72a308cce096fec400762dbe3bfaa8bcf22bd9232489f7d841438 WatchSource:0}: Error finding container 9de2c168a9c72a308cce096fec400762dbe3bfaa8bcf22bd9232489f7d841438: Status 404 returned error can't find the container with id 9de2c168a9c72a308cce096fec400762dbe3bfaa8bcf22bd9232489f7d841438
Apr 16 14:00:27.418300 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:27.418265 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbace3793_d54a_47c6_a30b_d75e319d7753.slice/crio-29900499f74f4e102335608748ec4f3734da6018f36b97cf84a67c205c81d830 WatchSource:0}: Error finding container 29900499f74f4e102335608748ec4f3734da6018f36b97cf84a67c205c81d830: Status 404 returned error can't find the container with id 29900499f74f4e102335608748ec4f3734da6018f36b97cf84a67c205c81d830
Apr 16 14:00:27.429820 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.429800 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2dz2d"]
Apr 16 14:00:27.432225 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:27.432206 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf570a9dc_9480_415b_9633_11fb3c3a05eb.slice/crio-1bebbc3deb7e88cc9780f561ccdd4e6fdfa973a2d4d563c32f7daab66167a3a0 WatchSource:0}: Error finding container 1bebbc3deb7e88cc9780f561ccdd4e6fdfa973a2d4d563c32f7daab66167a3a0: Status 404 returned error can't find the container with id 1bebbc3deb7e88cc9780f561ccdd4e6fdfa973a2d4d563c32f7daab66167a3a0
Apr 16 14:00:27.564903 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.564814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gc5pn" event={"ID":"bace3793-d54a-47c6-a30b-d75e319d7753","Type":"ContainerStarted","Data":"29900499f74f4e102335608748ec4f3734da6018f36b97cf84a67c205c81d830"}
Apr 16 14:00:27.566022 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.566003 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2dz2d" event={"ID":"f570a9dc-9480-415b-9633-11fb3c3a05eb","Type":"ContainerStarted","Data":"1bebbc3deb7e88cc9780f561ccdd4e6fdfa973a2d4d563c32f7daab66167a3a0"}
Apr 16 14:00:27.566987 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:27.566970 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m5nn8" event={"ID":"1522dd59-b1b0-4b61-8eed-6b2da396ebac","Type":"ContainerStarted","Data":"9de2c168a9c72a308cce096fec400762dbe3bfaa8bcf22bd9232489f7d841438"}
Apr 16 14:00:28.543641 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:28.543612 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g94vr"
Apr 16 14:00:29.575375 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:29.575324 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gc5pn" event={"ID":"bace3793-d54a-47c6-a30b-d75e319d7753","Type":"ContainerStarted","Data":"bda71eddcb1f8d237b145ef898ac0a9930c70d7db43bf0ebb877994d59bfb514"}
Apr 16 14:00:29.577548 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:29.577515 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2dz2d" event={"ID":"f570a9dc-9480-415b-9633-11fb3c3a05eb","Type":"ContainerStarted","Data":"b25f04372701fedd2ddc6efbb345525f48780ec437567d925da5cad5d9895b5f"}
Apr 16 14:00:29.577684 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:29.577557 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2dz2d" event={"ID":"f570a9dc-9480-415b-9633-11fb3c3a05eb","Type":"ContainerStarted","Data":"a355ffb66cb43879d44b025b1278ef4f319b2f8d6f854d6b7e08ecef414968ae"}
Apr 16 14:00:29.608603 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:29.608549 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2dz2d" podStartSLOduration=67.15085034 podStartE2EDuration="1m8.608524142s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 14:00:27.434027785 +0000 UTC m=+66.658698998" lastFinishedPulling="2026-04-16 14:00:28.891701582 +0000 UTC m=+68.116372800" observedRunningTime="2026-04-16 14:00:29.608023113 +0000 UTC m=+68.832694350" watchObservedRunningTime="2026-04-16 14:00:29.608524142 +0000 UTC m=+68.833195415"
Apr 16 14:00:30.582806 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:30.582769 2570 generic.go:358] "Generic (PLEG): container finished" podID="bace3793-d54a-47c6-a30b-d75e319d7753" containerID="bda71eddcb1f8d237b145ef898ac0a9930c70d7db43bf0ebb877994d59bfb514" exitCode=0
Apr 16 14:00:30.583272 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:30.582859 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gc5pn" event={"ID":"bace3793-d54a-47c6-a30b-d75e319d7753","Type":"ContainerDied","Data":"bda71eddcb1f8d237b145ef898ac0a9930c70d7db43bf0ebb877994d59bfb514"}
Apr 16 14:00:31.250048 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.250009 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr"]
Apr 16 14:00:31.254895 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.254870 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr"
Apr 16 14:00:31.257466 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.257439 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:00:31.257609 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.257498 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-ngmk9\""
Apr 16 14:00:31.263509 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.263228 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr"]
Apr 16 14:00:31.287674 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.287642 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84d7b4569-pt6vc"]
Apr 16 14:00:31.291204 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.291176 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.293702 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.293678 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sb62k\""
Apr 16 14:00:31.294166 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.294144 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 14:00:31.294353 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.294166 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 14:00:31.294950 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.294932 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 14:00:31.295042 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.294978 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 14:00:31.295481 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.295462 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 14:00:31.296047 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.295466 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 14:00:31.296047 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.295948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 14:00:31.302509 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.301909 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84d7b4569-pt6vc"]
Apr 16 14:00:31.304804 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.304784 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 14:00:31.362344 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362305 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-oauth-config\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.362454 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-oauth-serving-cert\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.362454 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362421 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-serving-cert\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.362454 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362440 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-trusted-ca-bundle\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.362627 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-config\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.362627 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362511 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-service-ca\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.362627 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-28lxr\" (UID: \"efa5b441-3877-44fb-8920-f8ce3027583c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr"
Apr 16 14:00:31.362784 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.362646 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ddfh\" (UniqueName: \"kubernetes.io/projected/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-kube-api-access-8ddfh\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463270 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-serving-cert\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463270 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463269 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-trusted-ca-bundle\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-config\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463377 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-service-ca\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463419 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-28lxr\" (UID: \"efa5b441-3877-44fb-8920-f8ce3027583c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr"
Apr 16 14:00:31.463559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ddfh\" (UniqueName: \"kubernetes.io/projected/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-kube-api-access-8ddfh\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463500 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-oauth-config\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.463559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.463548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-oauth-serving-cert\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.464135 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.464109 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-trusted-ca-bundle\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.464206 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.464135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-oauth-serving-cert\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.464807 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.464782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-service-ca\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.464930 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:31.464844 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 14:00:31.464930 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:31.464901 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert podName:efa5b441-3877-44fb-8920-f8ce3027583c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.964882621 +0000 UTC m=+71.189553843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-28lxr" (UID: "efa5b441-3877-44fb-8920-f8ce3027583c") : secret "monitoring-plugin-cert" not found
Apr 16 14:00:31.465048 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.464946 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-config\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.467098 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.467075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-serving-cert\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc"
Apr 16 14:00:31.467554 ip-10-0-136-109 kubenswrapper[2570]: I0416
14:00:31.467533 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-oauth-config\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:31.475372 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.475345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ddfh\" (UniqueName: \"kubernetes.io/projected/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-kube-api-access-8ddfh\") pod \"console-84d7b4569-pt6vc\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:31.587982 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.587857 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gc5pn" event={"ID":"bace3793-d54a-47c6-a30b-d75e319d7753","Type":"ContainerStarted","Data":"bb27a518171c82c5c496df04d120029ff3cfc4af91fdb78256c3da11c9b23c4b"} Apr 16 14:00:31.587982 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.587890 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gc5pn" event={"ID":"bace3793-d54a-47c6-a30b-d75e319d7753","Type":"ContainerStarted","Data":"0fb2f2265c50b53bf8de0363628f94fa650f7bcf3f12a8b4996f77610838be03"} Apr 16 14:00:31.589247 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.589222 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m5nn8" event={"ID":"1522dd59-b1b0-4b61-8eed-6b2da396ebac","Type":"ContainerStarted","Data":"fc1a46aa307fdaaace63013e95038222b25104262d702c9374dd8151c24e2650"} Apr 16 14:00:31.589363 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.589351 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m5nn8" 
Apr 16 14:00:31.604076 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.604048 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:31.605592 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.605460 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gc5pn" podStartSLOduration=3.623082033 podStartE2EDuration="5.605445483s" podCreationTimestamp="2026-04-16 14:00:26 +0000 UTC" firstStartedPulling="2026-04-16 14:00:27.419911395 +0000 UTC m=+66.644582609" lastFinishedPulling="2026-04-16 14:00:29.402274826 +0000 UTC m=+68.626946059" observedRunningTime="2026-04-16 14:00:31.604640829 +0000 UTC m=+70.829312077" watchObservedRunningTime="2026-04-16 14:00:31.605445483 +0000 UTC m=+70.830116720" Apr 16 14:00:31.619995 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.619940 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m5nn8" podStartSLOduration=66.669756824 podStartE2EDuration="1m10.619920895s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="2026-04-16 14:00:27.416598922 +0000 UTC m=+66.641270140" lastFinishedPulling="2026-04-16 14:00:31.366762993 +0000 UTC m=+70.591434211" observedRunningTime="2026-04-16 14:00:31.619730126 +0000 UTC m=+70.844401364" watchObservedRunningTime="2026-04-16 14:00:31.619920895 +0000 UTC m=+70.844592132" Apr 16 14:00:31.648062 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.648030 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-666696c969-rpcnv"] Apr 16 14:00:31.652719 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.652693 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.655663 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.655603 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 14:00:31.655779 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.655730 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 14:00:31.655849 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.655818 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8qzh5\"" Apr 16 14:00:31.655907 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.655886 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 14:00:31.656084 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.656068 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 14:00:31.656162 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.656121 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 14:00:31.665820 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.664427 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-666696c969-rpcnv"] Apr 16 14:00:31.668924 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.668899 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 14:00:31.728735 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.728704 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-84d7b4569-pt6vc"] Apr 16 14:00:31.731626 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:31.731596 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod762fb34e_42ad_4e7c_8594_2fb2a1886ab3.slice/crio-03af5d6c561d9d5f669f7b84a2025134f674b57a38fd11c6eaafb99c19585260 WatchSource:0}: Error finding container 03af5d6c561d9d5f669f7b84a2025134f674b57a38fd11c6eaafb99c19585260: Status 404 returned error can't find the container with id 03af5d6c561d9d5f669f7b84a2025134f674b57a38fd11c6eaafb99c19585260 Apr 16 14:00:31.769926 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.769889 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-serving-certs-ca-bundle\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.769926 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.769926 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-telemeter-client-tls\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.770128 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.769956 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " 
pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.770128 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.769972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-telemeter-trusted-ca-bundle\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.770128 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.770016 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-metrics-client-ca\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.770128 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.770042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-secret-telemeter-client\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.770128 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.770096 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-federate-client-tls\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.770301 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.770137 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vbrp\" (UniqueName: \"kubernetes.io/projected/9f6d8369-1763-4bdd-9853-41c96769db54-kube-api-access-4vbrp\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871467 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-federate-client-tls\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871467 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vbrp\" (UniqueName: \"kubernetes.io/projected/9f6d8369-1763-4bdd-9853-41c96769db54-kube-api-access-4vbrp\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871695 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-serving-certs-ca-bundle\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871695 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871507 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-telemeter-client-tls\") pod 
\"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871695 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871695 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-telemeter-trusted-ca-bundle\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871695 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-metrics-client-ca\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.871931 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.871842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-secret-telemeter-client\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.872488 
ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.872460 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-serving-certs-ca-bundle\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.872607 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.872547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-metrics-client-ca\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.872675 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.872650 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f6d8369-1763-4bdd-9853-41c96769db54-telemeter-trusted-ca-bundle\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.874002 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.873977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.874143 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.874122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-telemeter-client-tls\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.874292 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.874274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-federate-client-tls\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.874376 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.874359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/9f6d8369-1763-4bdd-9853-41c96769db54-secret-telemeter-client\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.879213 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.879190 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vbrp\" (UniqueName: \"kubernetes.io/projected/9f6d8369-1763-4bdd-9853-41c96769db54-kube-api-access-4vbrp\") pod \"telemeter-client-666696c969-rpcnv\" (UID: \"9f6d8369-1763-4bdd-9853-41c96769db54\") " pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:31.972609 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.972573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-28lxr\" (UID: \"efa5b441-3877-44fb-8920-f8ce3027583c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" Apr 16 
14:00:31.972771 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:31.972703 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 14:00:31.972771 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:00:31.972764 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert podName:efa5b441-3877-44fb-8920-f8ce3027583c nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.972748331 +0000 UTC m=+72.197419549 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-28lxr" (UID: "efa5b441-3877-44fb-8920-f8ce3027583c") : secret "monitoring-plugin-cert" not found Apr 16 14:00:31.974500 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:31.974471 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" Apr 16 14:00:32.100060 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:32.100011 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-666696c969-rpcnv"] Apr 16 14:00:32.124829 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:32.124794 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f6d8369_1763_4bdd_9853_41c96769db54.slice/crio-bf98c5c414f4d4fc02064605323d0f24da36521f25cc5cfd4aca8eb5f0af0e7f WatchSource:0}: Error finding container bf98c5c414f4d4fc02064605323d0f24da36521f25cc5cfd4aca8eb5f0af0e7f: Status 404 returned error can't find the container with id bf98c5c414f4d4fc02064605323d0f24da36521f25cc5cfd4aca8eb5f0af0e7f Apr 16 14:00:32.593727 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:32.593675 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7b4569-pt6vc" event={"ID":"762fb34e-42ad-4e7c-8594-2fb2a1886ab3","Type":"ContainerStarted","Data":"03af5d6c561d9d5f669f7b84a2025134f674b57a38fd11c6eaafb99c19585260"} Apr 16 14:00:32.595055 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:32.595023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" event={"ID":"9f6d8369-1763-4bdd-9853-41c96769db54","Type":"ContainerStarted","Data":"bf98c5c414f4d4fc02064605323d0f24da36521f25cc5cfd4aca8eb5f0af0e7f"} Apr 16 14:00:32.981216 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:32.981170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-28lxr\" (UID: \"efa5b441-3877-44fb-8920-f8ce3027583c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" Apr 16 14:00:32.984062 
ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:32.984036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/efa5b441-3877-44fb-8920-f8ce3027583c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-28lxr\" (UID: \"efa5b441-3877-44fb-8920-f8ce3027583c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" Apr 16 14:00:33.005533 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:33.005494 2570 patch_prober.go:28] interesting pod/image-registry-59d866bf84-bkpt6 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:00:33.005696 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:33.005560 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" podUID="11d21f1f-6ef5-4db0-9edf-20ae92adb2a7" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:00:33.067437 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:33.067407 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" Apr 16 14:00:33.201791 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:33.201758 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr"] Apr 16 14:00:33.206068 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:33.206036 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa5b441_3877_44fb_8920_f8ce3027583c.slice/crio-77ee9b2ef83a38e0448bcff6ddae72f9270bb96f1bf00fdd5ba24b29ba37d3c6 WatchSource:0}: Error finding container 77ee9b2ef83a38e0448bcff6ddae72f9270bb96f1bf00fdd5ba24b29ba37d3c6: Status 404 returned error can't find the container with id 77ee9b2ef83a38e0448bcff6ddae72f9270bb96f1bf00fdd5ba24b29ba37d3c6 Apr 16 14:00:33.599652 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:33.599617 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" event={"ID":"efa5b441-3877-44fb-8920-f8ce3027583c","Type":"ContainerStarted","Data":"77ee9b2ef83a38e0448bcff6ddae72f9270bb96f1bf00fdd5ba24b29ba37d3c6"} Apr 16 14:00:34.524121 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:34.524086 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-59d866bf84-bkpt6" Apr 16 14:00:35.607945 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:35.607909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7b4569-pt6vc" event={"ID":"762fb34e-42ad-4e7c-8594-2fb2a1886ab3","Type":"ContainerStarted","Data":"5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757"} Apr 16 14:00:35.609589 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:35.609554 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" 
event={"ID":"efa5b441-3877-44fb-8920-f8ce3027583c","Type":"ContainerStarted","Data":"8be9ddefa9a8748e26bed44215178cbeba9356e38ef9d9a4f4ff79f8108a5737"} Apr 16 14:00:35.609723 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:35.609706 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" Apr 16 14:00:35.615213 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:35.615187 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" Apr 16 14:00:35.625832 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:35.625786 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84d7b4569-pt6vc" podStartSLOduration=1.325007665 podStartE2EDuration="4.625773347s" podCreationTimestamp="2026-04-16 14:00:31 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.733495865 +0000 UTC m=+70.958167078" lastFinishedPulling="2026-04-16 14:00:35.034261533 +0000 UTC m=+74.258932760" observedRunningTime="2026-04-16 14:00:35.624267657 +0000 UTC m=+74.848938895" watchObservedRunningTime="2026-04-16 14:00:35.625773347 +0000 UTC m=+74.850444579" Apr 16 14:00:35.640793 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:35.640740 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-28lxr" podStartSLOduration=2.81542253 podStartE2EDuration="4.640726674s" podCreationTimestamp="2026-04-16 14:00:31 +0000 UTC" firstStartedPulling="2026-04-16 14:00:33.208434233 +0000 UTC m=+72.433105449" lastFinishedPulling="2026-04-16 14:00:35.033738373 +0000 UTC m=+74.258409593" observedRunningTime="2026-04-16 14:00:35.63948052 +0000 UTC m=+74.864151755" watchObservedRunningTime="2026-04-16 14:00:35.640726674 +0000 UTC m=+74.865397909" Apr 16 14:00:36.009847 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.009804 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 14:00:36.012968 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.012944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:00:36.023376 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.023329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3956767-9c3a-4525-b3d3-d3e177d9479f-original-pull-secret\") pod \"global-pull-secret-syncer-sxl7w\" (UID: \"c3956767-9c3a-4525-b3d3-d3e177d9479f\") " pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 14:00:36.268494 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.268412 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sxl7w" Apr 16 14:00:36.392157 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.392129 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sxl7w"] Apr 16 14:00:36.396951 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:00:36.396922 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3956767_9c3a_4525_b3d3_d3e177d9479f.slice/crio-30323e7bd158d2ad5148842cc194f3a5860f932c2e9a4fb189c21e30d84921fb WatchSource:0}: Error finding container 30323e7bd158d2ad5148842cc194f3a5860f932c2e9a4fb189c21e30d84921fb: Status 404 returned error can't find the container with id 30323e7bd158d2ad5148842cc194f3a5860f932c2e9a4fb189c21e30d84921fb Apr 16 14:00:36.615640 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.615551 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sxl7w" event={"ID":"c3956767-9c3a-4525-b3d3-d3e177d9479f","Type":"ContainerStarted","Data":"30323e7bd158d2ad5148842cc194f3a5860f932c2e9a4fb189c21e30d84921fb"} Apr 16 14:00:36.616920 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:36.616892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" event={"ID":"9f6d8369-1763-4bdd-9853-41c96769db54","Type":"ContainerStarted","Data":"fc9e32420437afcd2ab0c9e7b6f97c2f82d9b7e10b811f116869d6362c45dfba"} Apr 16 14:00:41.605169 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:41.605019 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:41.605169 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:41.605069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:41.610551 ip-10-0-136-109 kubenswrapper[2570]: I0416 
14:00:41.610528 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:41.637512 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:41.637483 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:00:43.640822 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:43.640775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" event={"ID":"9f6d8369-1763-4bdd-9853-41c96769db54","Type":"ContainerStarted","Data":"6505ee14b2e278d6190dbdad6594de66a860251f881753d0393f058f4d95743a"} Apr 16 14:00:43.640822 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:43.640824 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" event={"ID":"9f6d8369-1763-4bdd-9853-41c96769db54","Type":"ContainerStarted","Data":"840bdd620567faa4f6fbb6d351178b74e65b68277f3310309fb8bcf07968c307"} Apr 16 14:00:43.642069 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:43.642033 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sxl7w" event={"ID":"c3956767-9c3a-4525-b3d3-d3e177d9479f","Type":"ContainerStarted","Data":"86ba682a461bb808c4c6aba1ff060f72ab997b020e849909d5864f45d63a65fc"} Apr 16 14:00:43.662488 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:43.662432 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-666696c969-rpcnv" podStartSLOduration=1.630440562 podStartE2EDuration="12.662415676s" podCreationTimestamp="2026-04-16 14:00:31 +0000 UTC" firstStartedPulling="2026-04-16 14:00:32.12673816 +0000 UTC m=+71.351409373" lastFinishedPulling="2026-04-16 14:00:43.158713252 +0000 UTC m=+82.383384487" observedRunningTime="2026-04-16 14:00:43.660660084 +0000 UTC m=+82.885331319" watchObservedRunningTime="2026-04-16 
14:00:43.662415676 +0000 UTC m=+82.887086914" Apr 16 14:00:43.675419 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:43.675367 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-sxl7w" podStartSLOduration=65.493115368 podStartE2EDuration="1m11.675354743s" podCreationTimestamp="2026-04-16 13:59:32 +0000 UTC" firstStartedPulling="2026-04-16 14:00:36.39911392 +0000 UTC m=+75.623785134" lastFinishedPulling="2026-04-16 14:00:42.581353296 +0000 UTC m=+81.806024509" observedRunningTime="2026-04-16 14:00:43.674375315 +0000 UTC m=+82.899046552" watchObservedRunningTime="2026-04-16 14:00:43.675354743 +0000 UTC m=+82.900025978" Apr 16 14:00:55.299212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:00:55.299181 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84d7b4569-pt6vc"] Apr 16 14:01:02.597763 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:02.597728 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m5nn8" Apr 16 14:01:20.318455 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.318382 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84d7b4569-pt6vc" podUID="762fb34e-42ad-4e7c-8594-2fb2a1886ab3" containerName="console" containerID="cri-o://5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757" gracePeriod=15 Apr 16 14:01:20.558125 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.558099 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84d7b4569-pt6vc_762fb34e-42ad-4e7c-8594-2fb2a1886ab3/console/0.log" Apr 16 14:01:20.558269 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.558164 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:01:20.623212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623124 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-trusted-ca-bundle\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.623212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623157 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-service-ca\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.623212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623180 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-serving-cert\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.623212 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623206 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-oauth-config\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.623570 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623236 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-config\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.623570 ip-10-0-136-109 
kubenswrapper[2570]: I0416 14:01:20.623266 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-oauth-serving-cert\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.623670 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623629 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:20.623776 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623666 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-service-ca" (OuterVolumeSpecName: "service-ca") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:20.623833 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623779 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:20.623833 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.623792 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-config" (OuterVolumeSpecName: "console-config") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:20.625661 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.625634 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:20.625787 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.625708 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:20.724548 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724514 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ddfh\" (UniqueName: \"kubernetes.io/projected/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-kube-api-access-8ddfh\") pod \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\" (UID: \"762fb34e-42ad-4e7c-8594-2fb2a1886ab3\") " Apr 16 14:01:20.724740 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724652 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-trusted-ca-bundle\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:01:20.724740 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724667 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-service-ca\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:01:20.724740 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724676 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-serving-cert\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:01:20.724740 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724689 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-oauth-config\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:01:20.724740 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724701 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-console-config\") on node \"ip-10-0-136-109.ec2.internal\" 
DevicePath \"\"" Apr 16 14:01:20.724740 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.724711 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-oauth-serving-cert\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:01:20.726738 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.726708 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-kube-api-access-8ddfh" (OuterVolumeSpecName: "kube-api-access-8ddfh") pod "762fb34e-42ad-4e7c-8594-2fb2a1886ab3" (UID: "762fb34e-42ad-4e7c-8594-2fb2a1886ab3"). InnerVolumeSpecName "kube-api-access-8ddfh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:20.747004 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.746976 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84d7b4569-pt6vc_762fb34e-42ad-4e7c-8594-2fb2a1886ab3/console/0.log" Apr 16 14:01:20.747185 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.747030 2570 generic.go:358] "Generic (PLEG): container finished" podID="762fb34e-42ad-4e7c-8594-2fb2a1886ab3" containerID="5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757" exitCode=2 Apr 16 14:01:20.747185 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.747114 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84d7b4569-pt6vc" Apr 16 14:01:20.747185 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.747122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7b4569-pt6vc" event={"ID":"762fb34e-42ad-4e7c-8594-2fb2a1886ab3","Type":"ContainerDied","Data":"5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757"} Apr 16 14:01:20.747185 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.747168 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7b4569-pt6vc" event={"ID":"762fb34e-42ad-4e7c-8594-2fb2a1886ab3","Type":"ContainerDied","Data":"03af5d6c561d9d5f669f7b84a2025134f674b57a38fd11c6eaafb99c19585260"} Apr 16 14:01:20.747413 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.747189 2570 scope.go:117] "RemoveContainer" containerID="5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757" Apr 16 14:01:20.756075 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.756053 2570 scope.go:117] "RemoveContainer" containerID="5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757" Apr 16 14:01:20.756517 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:01:20.756486 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757\": container with ID starting with 5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757 not found: ID does not exist" containerID="5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757" Apr 16 14:01:20.756752 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.756518 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757"} err="failed to get container status \"5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757\": rpc error: code = 
NotFound desc = could not find container \"5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757\": container with ID starting with 5e9afa491651aa03e350bcb721eba9101a4eef25528aecfa23196ecde0cb6757 not found: ID does not exist" Apr 16 14:01:20.772026 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.772000 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84d7b4569-pt6vc"] Apr 16 14:01:20.775375 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.775350 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84d7b4569-pt6vc"] Apr 16 14:01:20.825379 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:20.825341 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ddfh\" (UniqueName: \"kubernetes.io/projected/762fb34e-42ad-4e7c-8594-2fb2a1886ab3-kube-api-access-8ddfh\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:01:21.359706 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:21.359674 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762fb34e-42ad-4e7c-8594-2fb2a1886ab3" path="/var/lib/kubelet/pods/762fb34e-42ad-4e7c-8594-2fb2a1886ab3/volumes" Apr 16 14:01:59.268106 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.268074 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-577ccd76d8-tq6qp"] Apr 16 14:01:59.268558 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.268336 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="762fb34e-42ad-4e7c-8594-2fb2a1886ab3" containerName="console" Apr 16 14:01:59.268558 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.268348 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="762fb34e-42ad-4e7c-8594-2fb2a1886ab3" containerName="console" Apr 16 14:01:59.268558 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.268388 2570 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="762fb34e-42ad-4e7c-8594-2fb2a1886ab3" containerName="console" Apr 16 14:01:59.271190 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.271173 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.276137 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276109 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:01:59.276310 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276109 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:01:59.276310 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276154 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sb62k\"" Apr 16 14:01:59.276310 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276229 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:01:59.276310 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276275 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:01:59.276521 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276473 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:01:59.276794 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276779 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:01:59.276908 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.276892 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:01:59.280929 ip-10-0-136-109 
kubenswrapper[2570]: I0416 14:01:59.280905 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:01:59.281043 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.280969 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-577ccd76d8-tq6qp"] Apr 16 14:01:59.392548 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-serving-cert\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.392732 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-service-ca\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.392732 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-oauth-config\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.392732 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392632 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-oauth-serving-cert\") pod \"console-577ccd76d8-tq6qp\" (UID: 
\"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.392732 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w22lj\" (UniqueName: \"kubernetes.io/projected/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-kube-api-access-w22lj\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.392873 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392751 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-config\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.392873 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.392779 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-trusted-ca-bundle\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493633 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-config\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493633 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493634 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-trusted-ca-bundle\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493813 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-serving-cert\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493813 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-service-ca\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493813 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493733 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-oauth-config\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493944 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-oauth-serving-cert\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.493944 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.493933 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w22lj\" (UniqueName: \"kubernetes.io/projected/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-kube-api-access-w22lj\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.494401 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.494377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-config\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.494531 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.494512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-trusted-ca-bundle\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.494578 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.494530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-oauth-serving-cert\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.494946 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.494932 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-service-ca\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.496685 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.496664 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-serving-cert\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.496777 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.496687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-oauth-config\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.501966 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.501944 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w22lj\" (UniqueName: \"kubernetes.io/projected/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-kube-api-access-w22lj\") pod \"console-577ccd76d8-tq6qp\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") " pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.584556 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.584445 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-577ccd76d8-tq6qp" Apr 16 14:01:59.702666 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.702494 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-577ccd76d8-tq6qp"] Apr 16 14:01:59.705499 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:01:59.705470 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bc9bba_08ff_4b5d_9adf_cb5d049d1781.slice/crio-e17296057962ab83d545d98b29d485a0c544847abde8f0cf3101c95b0c35f8e3 WatchSource:0}: Error finding container e17296057962ab83d545d98b29d485a0c544847abde8f0cf3101c95b0c35f8e3: Status 404 returned error can't find the container with id e17296057962ab83d545d98b29d485a0c544847abde8f0cf3101c95b0c35f8e3 Apr 16 14:01:59.852193 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.852108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577ccd76d8-tq6qp" event={"ID":"65bc9bba-08ff-4b5d-9adf-cb5d049d1781","Type":"ContainerStarted","Data":"ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade"} Apr 16 14:01:59.852193 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.852145 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577ccd76d8-tq6qp" event={"ID":"65bc9bba-08ff-4b5d-9adf-cb5d049d1781","Type":"ContainerStarted","Data":"e17296057962ab83d545d98b29d485a0c544847abde8f0cf3101c95b0c35f8e3"} Apr 16 14:01:59.871722 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:01:59.871674 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-577ccd76d8-tq6qp" podStartSLOduration=0.871660265 podStartE2EDuration="871.660265ms" podCreationTimestamp="2026-04-16 14:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:59.870135072 +0000 UTC 
m=+159.094806306" watchObservedRunningTime="2026-04-16 14:01:59.871660265 +0000 UTC m=+159.096331499"
Apr 16 14:02:09.585282 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:09.585172 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-577ccd76d8-tq6qp"
Apr 16 14:02:09.585282 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:09.585238 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-577ccd76d8-tq6qp"
Apr 16 14:02:09.590034 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:09.590009 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-577ccd76d8-tq6qp"
Apr 16 14:02:09.881943 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:09.881864 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-577ccd76d8-tq6qp"
Apr 16 14:02:48.128578 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.128543 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"]
Apr 16 14:02:48.131559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.131541 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.133862 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.133839 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 14:02:48.133995 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.133977 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 14:02:48.135267 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.135250 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 14:02:48.135337 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.135301 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-g6mtn\""
Apr 16 14:02:48.135381 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.135361 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 14:02:48.140284 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.140263 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"]
Apr 16 14:02:48.234203 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.234165 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"]
Apr 16 14:02:48.237273 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.237254 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.239360 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.239342 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 14:02:48.247149 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.247127 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"]
Apr 16 14:02:48.250010 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.249990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"]
Apr 16 14:02:48.250087 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.250082 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.252483 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.252465 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 14:02:48.252591 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.252469 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 14:02:48.252591 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.252532 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 14:02:48.252952 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.252937 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 14:02:48.261913 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.261892 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"]
Apr 16 14:02:48.277098 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.277074 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7x6h\" (UniqueName: \"kubernetes.io/projected/788d6e5f-31fe-424d-9709-eea5a95b9a5c-kube-api-access-t7x6h\") pod \"managed-serviceaccount-addon-agent-84b99c995c-8lsfd\" (UID: \"788d6e5f-31fe-424d-9709-eea5a95b9a5c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.277197 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.277110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/788d6e5f-31fe-424d-9709-eea5a95b9a5c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84b99c995c-8lsfd\" (UID: \"788d6e5f-31fe-424d-9709-eea5a95b9a5c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.378121 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378085 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8db9be41-94e7-40ff-9640-4818eac29b16-klusterlet-config\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.378121 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-ca\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.378343 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7x6h\" (UniqueName: \"kubernetes.io/projected/788d6e5f-31fe-424d-9709-eea5a95b9a5c-kube-api-access-t7x6h\") pod \"managed-serviceaccount-addon-agent-84b99c995c-8lsfd\" (UID: \"788d6e5f-31fe-424d-9709-eea5a95b9a5c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.378343 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/788d6e5f-31fe-424d-9709-eea5a95b9a5c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84b99c995c-8lsfd\" (UID: \"788d6e5f-31fe-424d-9709-eea5a95b9a5c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.378343 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.378476 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxzx\" (UniqueName: \"kubernetes.io/projected/4d1341d9-1e21-4649-87d7-a05b6842e881-kube-api-access-clxzx\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.378476 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378377 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-hub\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.378476 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4d1341d9-1e21-4649-87d7-a05b6842e881-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.378476 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.378689 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8db9be41-94e7-40ff-9640-4818eac29b16-tmp\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.378689 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.378545 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frt2b\" (UniqueName: \"kubernetes.io/projected/8db9be41-94e7-40ff-9640-4818eac29b16-kube-api-access-frt2b\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.380638 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.380621 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/788d6e5f-31fe-424d-9709-eea5a95b9a5c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-84b99c995c-8lsfd\" (UID: \"788d6e5f-31fe-424d-9709-eea5a95b9a5c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.387209 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.387190 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7x6h\" (UniqueName: \"kubernetes.io/projected/788d6e5f-31fe-424d-9709-eea5a95b9a5c-kube-api-access-t7x6h\") pod \"managed-serviceaccount-addon-agent-84b99c995c-8lsfd\" (UID: \"788d6e5f-31fe-424d-9709-eea5a95b9a5c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.454435 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.454407 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"
Apr 16 14:02:48.479356 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479306 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.479498 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clxzx\" (UniqueName: \"kubernetes.io/projected/4d1341d9-1e21-4649-87d7-a05b6842e881-kube-api-access-clxzx\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.479498 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-hub\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.479498 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4d1341d9-1e21-4649-87d7-a05b6842e881-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.479498 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.479498 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8db9be41-94e7-40ff-9640-4818eac29b16-tmp\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.479753 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479509 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frt2b\" (UniqueName: \"kubernetes.io/projected/8db9be41-94e7-40ff-9640-4818eac29b16-kube-api-access-frt2b\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.479753 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479542 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8db9be41-94e7-40ff-9640-4818eac29b16-klusterlet-config\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.479753 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.479574 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-ca\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.480129 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.480076 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8db9be41-94e7-40ff-9640-4818eac29b16-tmp\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.480476 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.480449 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/4d1341d9-1e21-4649-87d7-a05b6842e881-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.482029 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.482002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-ca\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.482211 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.482158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-hub\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.482477 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.482459 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/8db9be41-94e7-40ff-9640-4818eac29b16-klusterlet-config\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.482672 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.482655 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.483074 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.483056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4d1341d9-1e21-4649-87d7-a05b6842e881-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.488571 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.488545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frt2b\" (UniqueName: \"kubernetes.io/projected/8db9be41-94e7-40ff-9640-4818eac29b16-kube-api-access-frt2b\") pod \"klusterlet-addon-workmgr-b55479476-n5llj\" (UID: \"8db9be41-94e7-40ff-9640-4818eac29b16\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.488674 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.488566 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxzx\" (UniqueName: \"kubernetes.io/projected/4d1341d9-1e21-4649-87d7-a05b6842e881-kube-api-access-clxzx\") pod \"cluster-proxy-proxy-agent-78f587b577-lv9j7\" (UID: \"4d1341d9-1e21-4649-87d7-a05b6842e881\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.546641 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.546611 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:02:48.559454 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.559429 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"
Apr 16 14:02:48.569158 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.569131 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd"]
Apr 16 14:02:48.572093 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:02:48.572054 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod788d6e5f_31fe_424d_9709_eea5a95b9a5c.slice/crio-707bbecf73d390758cb715635c3f6d4672445c977faa53cd339cb797f250c748 WatchSource:0}: Error finding container 707bbecf73d390758cb715635c3f6d4672445c977faa53cd339cb797f250c748: Status 404 returned error can't find the container with id 707bbecf73d390758cb715635c3f6d4672445c977faa53cd339cb797f250c748
Apr 16 14:02:48.668539 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.668504 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"]
Apr 16 14:02:48.671537 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:02:48.671507 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db9be41_94e7_40ff_9640_4818eac29b16.slice/crio-e02b1543f4265aa47811187c814de3e89e1628a58cbfb9d48452764103660e3f WatchSource:0}: Error finding container e02b1543f4265aa47811187c814de3e89e1628a58cbfb9d48452764103660e3f: Status 404 returned error can't find the container with id e02b1543f4265aa47811187c814de3e89e1628a58cbfb9d48452764103660e3f
Apr 16 14:02:48.689770 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.689751 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7"]
Apr 16 14:02:48.692064 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:02:48.692040 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d1341d9_1e21_4649_87d7_a05b6842e881.slice/crio-940cf7c3bd7c3a06a3f341fa399b3055ad7a9535ab64a6d33f7abef1b3a39aeb WatchSource:0}: Error finding container 940cf7c3bd7c3a06a3f341fa399b3055ad7a9535ab64a6d33f7abef1b3a39aeb: Status 404 returned error can't find the container with id 940cf7c3bd7c3a06a3f341fa399b3055ad7a9535ab64a6d33f7abef1b3a39aeb
Apr 16 14:02:48.980233 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.980193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd" event={"ID":"788d6e5f-31fe-424d-9709-eea5a95b9a5c","Type":"ContainerStarted","Data":"707bbecf73d390758cb715635c3f6d4672445c977faa53cd339cb797f250c748"}
Apr 16 14:02:48.981132 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.981106 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7" event={"ID":"4d1341d9-1e21-4649-87d7-a05b6842e881","Type":"ContainerStarted","Data":"940cf7c3bd7c3a06a3f341fa399b3055ad7a9535ab64a6d33f7abef1b3a39aeb"}
Apr 16 14:02:48.981990 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:48.981970 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj" event={"ID":"8db9be41-94e7-40ff-9640-4818eac29b16","Type":"ContainerStarted","Data":"e02b1543f4265aa47811187c814de3e89e1628a58cbfb9d48452764103660e3f"}
Apr 16 14:02:52.994707 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:52.994670 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7" event={"ID":"4d1341d9-1e21-4649-87d7-a05b6842e881","Type":"ContainerStarted","Data":"65937c23588f4b592be6638c01eac3c82b6530957df8cc8579da168c274c2fd7"}
Apr 16 14:02:52.995873 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:52.995847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd" event={"ID":"788d6e5f-31fe-424d-9709-eea5a95b9a5c","Type":"ContainerStarted","Data":"1d0ad776807bd15d11a480a48ff9d84dc97f235470fea972e6d6e70a3097ea4f"}
Apr 16 14:02:53.011554 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:53.011512 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-84b99c995c-8lsfd" podStartSLOduration=1.526551934 podStartE2EDuration="5.011499395s" podCreationTimestamp="2026-04-16 14:02:48 +0000 UTC" firstStartedPulling="2026-04-16 14:02:48.573981123 +0000 UTC m=+207.798652336" lastFinishedPulling="2026-04-16 14:02:52.05892858 +0000 UTC m=+211.283599797" observedRunningTime="2026-04-16 14:02:53.009990975 +0000 UTC m=+212.234662211" watchObservedRunningTime="2026-04-16 14:02:53.011499395 +0000 UTC m=+212.236170630"
Apr 16 14:02:55.010836 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:55.010798 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7" event={"ID":"4d1341d9-1e21-4649-87d7-a05b6842e881","Type":"ContainerStarted","Data":"42496d64a46b7e3a2daa741a0349144a328c053eb4a1cc000ce96fc81dcff640"}
Apr 16 14:02:55.010836 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:55.010835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7" event={"ID":"4d1341d9-1e21-4649-87d7-a05b6842e881","Type":"ContainerStarted","Data":"87d9cce4e6e364b7225a0bc833e1d1486d80e1eb793aadc91238b4de905d4730"}
Apr 16 14:02:55.029772 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:02:55.029727 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78f587b577-lv9j7" podStartSLOduration=1.396818586 podStartE2EDuration="7.029713499s" podCreationTimestamp="2026-04-16 14:02:48 +0000 UTC" firstStartedPulling="2026-04-16 14:02:48.693724586 +0000 UTC m=+207.918395800" lastFinishedPulling="2026-04-16 14:02:54.326619494 +0000 UTC m=+213.551290713" observedRunningTime="2026-04-16 14:02:55.02890478 +0000 UTC m=+214.253576018" watchObservedRunningTime="2026-04-16 14:02:55.029713499 +0000 UTC m=+214.254384733"
Apr 16 14:03:04.034866 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:03:04.034833 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj" event={"ID":"8db9be41-94e7-40ff-9640-4818eac29b16","Type":"ContainerStarted","Data":"6428f45ecf3430cf400018f27e1554c009281c6e9db6a08e40a8ffbbbcea1008"}
Apr 16 14:03:04.035263 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:03:04.035069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:03:04.036900 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:03:04.036880 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj"
Apr 16 14:03:04.050736 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:03:04.050677 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b55479476-n5llj" podStartSLOduration=1.416403573 podStartE2EDuration="16.050663158s" podCreationTimestamp="2026-04-16 14:02:48 +0000 UTC" firstStartedPulling="2026-04-16 14:02:48.673403194 +0000 UTC m=+207.898074407" lastFinishedPulling="2026-04-16 14:03:03.307662767 +0000 UTC m=+222.532333992" observedRunningTime="2026-04-16 14:03:04.049605381 +0000 UTC m=+223.274276615" watchObservedRunningTime="2026-04-16 14:03:04.050663158 +0000 UTC m=+223.275334392"
Apr 16 14:04:21.240674 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:04:21.240644 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log"
Apr 16 14:04:21.241185 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:04:21.240734 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log"
Apr 16 14:04:21.247712 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:04:21.247688 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 14:06:11.221759 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:11.221722 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-577ccd76d8-tq6qp"]
Apr 16 14:06:36.243065 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.243000 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-577ccd76d8-tq6qp" podUID="65bc9bba-08ff-4b5d-9adf-cb5d049d1781" containerName="console" containerID="cri-o://ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade" gracePeriod=15
Apr 16 14:06:36.483938 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.483912 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-577ccd76d8-tq6qp_65bc9bba-08ff-4b5d-9adf-cb5d049d1781/console/0.log"
Apr 16 14:06:36.484120 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.483976 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-577ccd76d8-tq6qp"
Apr 16 14:06:36.600839 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.600755 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-577ccd76d8-tq6qp_65bc9bba-08ff-4b5d-9adf-cb5d049d1781/console/0.log"
Apr 16 14:06:36.600839 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.600797 2570 generic.go:358] "Generic (PLEG): container finished" podID="65bc9bba-08ff-4b5d-9adf-cb5d049d1781" containerID="ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade" exitCode=2
Apr 16 14:06:36.600839 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.600829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577ccd76d8-tq6qp" event={"ID":"65bc9bba-08ff-4b5d-9adf-cb5d049d1781","Type":"ContainerDied","Data":"ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade"}
Apr 16 14:06:36.601099 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.600852 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-577ccd76d8-tq6qp" event={"ID":"65bc9bba-08ff-4b5d-9adf-cb5d049d1781","Type":"ContainerDied","Data":"e17296057962ab83d545d98b29d485a0c544847abde8f0cf3101c95b0c35f8e3"}
Apr 16 14:06:36.601099 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.600867 2570 scope.go:117] "RemoveContainer" containerID="ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade"
Apr 16 14:06:36.601099 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.600877 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-577ccd76d8-tq6qp"
Apr 16 14:06:36.608157 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.608138 2570 scope.go:117] "RemoveContainer" containerID="ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade"
Apr 16 14:06:36.608426 ip-10-0-136-109 kubenswrapper[2570]: E0416 14:06:36.608407 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade\": container with ID starting with ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade not found: ID does not exist" containerID="ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade"
Apr 16 14:06:36.608480 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.608434 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade"} err="failed to get container status \"ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade\": rpc error: code = NotFound desc = could not find container \"ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade\": container with ID starting with ca8767dbdd255db219a895944a8e7a909de621d89f8e45ab4632478b7a541ade not found: ID does not exist"
Apr 16 14:06:36.634687 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634660 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-config\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.634750 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634715 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w22lj\" (UniqueName: \"kubernetes.io/projected/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-kube-api-access-w22lj\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.634750 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634741 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-serving-cert\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.634819 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634767 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-oauth-serving-cert\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.634819 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634784 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-service-ca\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.634819 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634815 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-oauth-config\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.634966 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.634831 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-trusted-ca-bundle\") pod \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\" (UID: \"65bc9bba-08ff-4b5d-9adf-cb5d049d1781\") "
Apr 16 14:06:36.635148 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.635106 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-config" (OuterVolumeSpecName: "console-config") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:06:36.635261 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.635239 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:06:36.635362 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.635257 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-service-ca" (OuterVolumeSpecName: "service-ca") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:06:36.635362 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.635339 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:06:36.636953 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.636930 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-kube-api-access-w22lj" (OuterVolumeSpecName: "kube-api-access-w22lj") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "kube-api-access-w22lj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:06:36.637021 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.636919 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:06:36.637021 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.636998 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "65bc9bba-08ff-4b5d-9adf-cb5d049d1781" (UID: "65bc9bba-08ff-4b5d-9adf-cb5d049d1781"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:06:36.735834 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.735787 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-oauth-serving-cert\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.735834 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.735832 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-service-ca\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.735834 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.735842 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-oauth-config\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.736043 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.735851 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-trusted-ca-bundle\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.736043 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.735861 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-config\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.736043 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.735870 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w22lj\" (UniqueName: \"kubernetes.io/projected/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-kube-api-access-w22lj\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.736043 ip-10-0-136-109 
kubenswrapper[2570]: I0416 14:06:36.735879 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65bc9bba-08ff-4b5d-9adf-cb5d049d1781-console-serving-cert\") on node \"ip-10-0-136-109.ec2.internal\" DevicePath \"\"" Apr 16 14:06:36.921598 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.921569 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-577ccd76d8-tq6qp"] Apr 16 14:06:36.924584 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:36.924558 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-577ccd76d8-tq6qp"] Apr 16 14:06:37.362278 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:06:37.362186 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bc9bba-08ff-4b5d-9adf-cb5d049d1781" path="/var/lib/kubelet/pods/65bc9bba-08ff-4b5d-9adf-cb5d049d1781/volumes" Apr 16 14:09:21.261395 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:09:21.261365 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:09:21.261829 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:09:21.261560 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:14:21.281111 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:14:21.281032 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:14:21.281696 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:14:21.281285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:19:21.298599 ip-10-0-136-109 kubenswrapper[2570]: I0416 
14:19:21.298570 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:19:21.299631 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:19:21.299616 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:24:21.317544 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:24:21.317517 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:24:21.320015 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:24:21.318978 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:29:21.334084 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:29:21.333971 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:29:21.341531 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:29:21.337563 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:34:21.352984 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:34:21.352872 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:34:21.356963 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:34:21.355673 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:39:21.369771 ip-10-0-136-109 kubenswrapper[2570]: I0416 
14:39:21.369665 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:39:21.373604 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:39:21.373587 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:44:21.386962 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:44:21.386846 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:44:21.391970 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:44:21.391952 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log" Apr 16 14:47:28.248092 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.248053 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wdzv9/must-gather-f97t2"] Apr 16 14:47:28.248559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.248358 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65bc9bba-08ff-4b5d-9adf-cb5d049d1781" containerName="console" Apr 16 14:47:28.248559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.248370 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bc9bba-08ff-4b5d-9adf-cb5d049d1781" containerName="console" Apr 16 14:47:28.248559 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.248417 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="65bc9bba-08ff-4b5d-9adf-cb5d049d1781" containerName="console" Apr 16 14:47:28.251170 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.251155 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.253468 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.253445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wdzv9\"/\"default-dockercfg-hq28d\"" Apr 16 14:47:28.253599 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.253485 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wdzv9\"/\"openshift-service-ca.crt\"" Apr 16 14:47:28.253599 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.253446 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wdzv9\"/\"kube-root-ca.crt\"" Apr 16 14:47:28.257399 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.257377 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/must-gather-f97t2"] Apr 16 14:47:28.403490 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.403451 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbmr9\" (UniqueName: \"kubernetes.io/projected/88d99faf-5bb6-4b35-b5cb-beead5f77765-kube-api-access-dbmr9\") pod \"must-gather-f97t2\" (UID: \"88d99faf-5bb6-4b35-b5cb-beead5f77765\") " pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.403670 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.403500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/88d99faf-5bb6-4b35-b5cb-beead5f77765-must-gather-output\") pod \"must-gather-f97t2\" (UID: \"88d99faf-5bb6-4b35-b5cb-beead5f77765\") " pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.504782 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.504675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbmr9\" (UniqueName: 
\"kubernetes.io/projected/88d99faf-5bb6-4b35-b5cb-beead5f77765-kube-api-access-dbmr9\") pod \"must-gather-f97t2\" (UID: \"88d99faf-5bb6-4b35-b5cb-beead5f77765\") " pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.505017 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.504996 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/88d99faf-5bb6-4b35-b5cb-beead5f77765-must-gather-output\") pod \"must-gather-f97t2\" (UID: \"88d99faf-5bb6-4b35-b5cb-beead5f77765\") " pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.505276 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.505257 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/88d99faf-5bb6-4b35-b5cb-beead5f77765-must-gather-output\") pod \"must-gather-f97t2\" (UID: \"88d99faf-5bb6-4b35-b5cb-beead5f77765\") " pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.513465 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.513441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbmr9\" (UniqueName: \"kubernetes.io/projected/88d99faf-5bb6-4b35-b5cb-beead5f77765-kube-api-access-dbmr9\") pod \"must-gather-f97t2\" (UID: \"88d99faf-5bb6-4b35-b5cb-beead5f77765\") " pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.561649 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.561616 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wdzv9/must-gather-f97t2" Apr 16 14:47:28.681972 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.681912 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/must-gather-f97t2"] Apr 16 14:47:28.686616 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:47:28.686587 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88d99faf_5bb6_4b35_b5cb_beead5f77765.slice/crio-7936fd57d7a12a5bb9ed1ed7752a948b858e425723fa38744a717b37fe688b78 WatchSource:0}: Error finding container 7936fd57d7a12a5bb9ed1ed7752a948b858e425723fa38744a717b37fe688b78: Status 404 returned error can't find the container with id 7936fd57d7a12a5bb9ed1ed7752a948b858e425723fa38744a717b37fe688b78 Apr 16 14:47:28.688385 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:28.688366 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:47:29.048377 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:29.048338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/must-gather-f97t2" event={"ID":"88d99faf-5bb6-4b35-b5cb-beead5f77765","Type":"ContainerStarted","Data":"7936fd57d7a12a5bb9ed1ed7752a948b858e425723fa38744a717b37fe688b78"} Apr 16 14:47:30.054368 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:30.054051 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/must-gather-f97t2" event={"ID":"88d99faf-5bb6-4b35-b5cb-beead5f77765","Type":"ContainerStarted","Data":"eec65950483cf7b30e3f807141629deb6f8f1144f5a1e35be868c4d230831ecb"} Apr 16 14:47:30.054368 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:30.054095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/must-gather-f97t2" 
event={"ID":"88d99faf-5bb6-4b35-b5cb-beead5f77765","Type":"ContainerStarted","Data":"a2a9851aee3dbf18e8e3da2941ed1fbba98a58b87d937a4b427fd5fd456094af"} Apr 16 14:47:30.068811 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:30.068748 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wdzv9/must-gather-f97t2" podStartSLOduration=1.337720813 podStartE2EDuration="2.068726235s" podCreationTimestamp="2026-04-16 14:47:28 +0000 UTC" firstStartedPulling="2026-04-16 14:47:28.688553721 +0000 UTC m=+2887.913224948" lastFinishedPulling="2026-04-16 14:47:29.419559155 +0000 UTC m=+2888.644230370" observedRunningTime="2026-04-16 14:47:30.067802123 +0000 UTC m=+2889.292473369" watchObservedRunningTime="2026-04-16 14:47:30.068726235 +0000 UTC m=+2889.293397471" Apr 16 14:47:30.887090 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:30.887044 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-sxl7w_c3956767-9c3a-4525-b3d3-d3e177d9479f/global-pull-secret-syncer/0.log" Apr 16 14:47:31.005713 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:31.005678 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-x8zzl_3ad29733-1e32-4cb7-9641-906b311b4961/konnectivity-agent/0.log" Apr 16 14:47:31.079386 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:31.079357 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-109.ec2.internal_2a1992b488b3339bc008fb24c80291d9/haproxy/0.log" Apr 16 14:47:34.457517 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:34.457490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-28lxr_efa5b441-3877-44fb-8920-f8ce3027583c/monitoring-plugin/0.log" Apr 16 14:47:34.563804 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:34.563769 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-gc5pn_bace3793-d54a-47c6-a30b-d75e319d7753/node-exporter/0.log" Apr 16 14:47:34.584414 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:34.584392 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gc5pn_bace3793-d54a-47c6-a30b-d75e319d7753/kube-rbac-proxy/0.log" Apr 16 14:47:34.605845 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:34.605811 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gc5pn_bace3793-d54a-47c6-a30b-d75e319d7753/init-textfile/0.log" Apr 16 14:47:34.961869 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:34.961835 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xgrnx_4348b7d4-6755-4435-9405-2298a8d123bc/prometheus-operator/0.log" Apr 16 14:47:34.985551 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:34.985476 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-xgrnx_4348b7d4-6755-4435-9405-2298a8d123bc/kube-rbac-proxy/0.log" Apr 16 14:47:35.009179 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:35.009146 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-kqz98_98f256c6-561e-44bb-8dcb-e35ac6f8bab0/prometheus-operator-admission-webhook/0.log" Apr 16 14:47:35.038414 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:35.038363 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-666696c969-rpcnv_9f6d8369-1763-4bdd-9853-41c96769db54/telemeter-client/0.log" Apr 16 14:47:35.059339 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:35.059285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-666696c969-rpcnv_9f6d8369-1763-4bdd-9853-41c96769db54/reload/0.log" Apr 16 14:47:35.078500 ip-10-0-136-109 
kubenswrapper[2570]: I0416 14:47:35.078469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-666696c969-rpcnv_9f6d8369-1763-4bdd-9853-41c96769db54/kube-rbac-proxy/0.log" Apr 16 14:47:38.086613 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.086576 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8"] Apr 16 14:47:38.091157 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.091133 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.095851 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.095827 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8"] Apr 16 14:47:38.190497 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.190458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-podres\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.190785 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.190764 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-lib-modules\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.191706 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.191682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtnvx\" (UniqueName: 
\"kubernetes.io/projected/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-kube-api-access-xtnvx\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.191811 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.191713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-proc\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.191811 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.191745 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-sys\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.292895 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.292859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-lib-modules\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293074 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.292940 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-kube-api-access-xtnvx\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 
14:47:38.293074 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293002 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-proc\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293074 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293034 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-lib-modules\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293074 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-sys\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293265 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-sys\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293265 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293119 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-podres\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " 
pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293265 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-proc\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.293265 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.293220 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-podres\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.300866 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.300841 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtnvx\" (UniqueName: \"kubernetes.io/projected/9d938034-a0df-4b1e-9318-6d8f05b0e6e0-kube-api-access-xtnvx\") pod \"perf-node-gather-daemonset-tcbq8\" (UID: \"9d938034-a0df-4b1e-9318-6d8f05b0e6e0\") " pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" Apr 16 14:47:38.326292 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.326250 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g94vr_4ade50be-58d9-4908-9196-52d293c0182d/dns/0.log" Apr 16 14:47:38.344657 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.344593 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g94vr_4ade50be-58d9-4908-9196-52d293c0182d/kube-rbac-proxy/0.log" Apr 16 14:47:38.404007 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.403947 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8"
Apr 16 14:47:38.452509 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.452483 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvl68_d62a9568-15dd-4b2a-b879-e1ae35037432/dns-node-resolver/0.log"
Apr 16 14:47:38.540090 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.539909 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8"]
Apr 16 14:47:38.543000 ip-10-0-136-109 kubenswrapper[2570]: W0416 14:47:38.542966 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9d938034_a0df_4b1e_9318_6d8f05b0e6e0.slice/crio-a39e8dba2c95309dfc26b3e9035fe4167d00a4a75af124c288e4b5a6bc47a265 WatchSource:0}: Error finding container a39e8dba2c95309dfc26b3e9035fe4167d00a4a75af124c288e4b5a6bc47a265: Status 404 returned error can't find the container with id a39e8dba2c95309dfc26b3e9035fe4167d00a4a75af124c288e4b5a6bc47a265
Apr 16 14:47:38.870052 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.870027 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-59d866bf84-bkpt6_11d21f1f-6ef5-4db0-9edf-20ae92adb2a7/registry/0.log"
Apr 16 14:47:38.936090 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:38.936062 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q4pbl_3f616ae2-8c1b-4e05-b95a-1e9e5ed4db5d/node-ca/0.log"
Apr 16 14:47:39.087267 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:39.087232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" event={"ID":"9d938034-a0df-4b1e-9318-6d8f05b0e6e0","Type":"ContainerStarted","Data":"5bcf242e22792282d6b1e23b87e6d12fb5c0fbd176a2d8c23c2201dff7047f1f"}
Apr 16 14:47:39.087267 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:39.087266 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" event={"ID":"9d938034-a0df-4b1e-9318-6d8f05b0e6e0","Type":"ContainerStarted","Data":"a39e8dba2c95309dfc26b3e9035fe4167d00a4a75af124c288e4b5a6bc47a265"}
Apr 16 14:47:39.087687 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:39.087383 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8"
Apr 16 14:47:39.103732 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:39.103674 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8" podStartSLOduration=1.103655593 podStartE2EDuration="1.103655593s" podCreationTimestamp="2026-04-16 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:47:39.102265809 +0000 UTC m=+2898.326937044" watchObservedRunningTime="2026-04-16 14:47:39.103655593 +0000 UTC m=+2898.328326832"
Apr 16 14:47:39.972351 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:39.972295 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2z9p5_d9aec4e9-fdb6-48ec-9e84-bfcd44149787/serve-healthcheck-canary/0.log"
Apr 16 14:47:40.495612 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:40.495575 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lmzjv_472bbe6c-6852-49ae-b248-56beee337ffa/kube-rbac-proxy/0.log"
Apr 16 14:47:40.513341 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:40.513292 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lmzjv_472bbe6c-6852-49ae-b248-56beee337ffa/exporter/0.log"
Apr 16 14:47:40.561049 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:40.561012 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lmzjv_472bbe6c-6852-49ae-b248-56beee337ffa/extractor/0.log"
Apr 16 14:47:45.101234 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:45.101201 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wdzv9/perf-node-gather-daemonset-tcbq8"
Apr 16 14:47:47.779352 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.779262 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:47:47.799143 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.799111 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/egress-router-binary-copy/0.log"
Apr 16 14:47:47.819862 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.819833 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/cni-plugins/0.log"
Apr 16 14:47:47.840512 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.840475 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/bond-cni-plugin/0.log"
Apr 16 14:47:47.859159 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.859130 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/routeoverride-cni/0.log"
Apr 16 14:47:47.877330 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.877285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/whereabouts-cni-bincopy/0.log"
Apr 16 14:47:47.898776 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:47.898744 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mvgp6_1cbe0c2f-4375-424a-a6f9-acf5ed5f216c/whereabouts-cni/0.log"
Apr 16 14:47:48.092231 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:48.092157 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-skgt6_45edff06-17ec-4445-a612-10113a6f9a02/kube-multus/0.log"
Apr 16 14:47:48.169037 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:48.169005 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2dz2d_f570a9dc-9480-415b-9633-11fb3c3a05eb/network-metrics-daemon/0.log"
Apr 16 14:47:48.186881 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:48.186858 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2dz2d_f570a9dc-9480-415b-9633-11fb3c3a05eb/kube-rbac-proxy/0.log"
Apr 16 14:47:49.371286 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.371242 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-controller/0.log"
Apr 16 14:47:49.405529 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.405500 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/0.log"
Apr 16 14:47:49.428918 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.428874 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovn-acl-logging/1.log"
Apr 16 14:47:49.448230 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.448149 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/kube-rbac-proxy-node/0.log"
Apr 16 14:47:49.466970 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.466936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:47:49.486435 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.486401 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/northd/0.log"
Apr 16 14:47:49.507609 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.507584 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/nbdb/0.log"
Apr 16 14:47:49.526910 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.526882 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/sbdb/0.log"
Apr 16 14:47:49.631360 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:49.631326 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fkt9w_8ad82b8c-5f9d-40e3-bf04-ee7dff525d90/ovnkube-controller/0.log"
Apr 16 14:47:50.901711 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:50.901678 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m5nn8_1522dd59-b1b0-4b61-8eed-6b2da396ebac/network-check-target-container/0.log"
Apr 16 14:47:51.774859 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:51.774828 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gn66w_7d916fa4-9672-4e7a-be82-02e78c5a0df3/iptables-alerter/0.log"
Apr 16 14:47:52.419334 ip-10-0-136-109 kubenswrapper[2570]: I0416 14:47:52.419291 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-b8kz6_d47f2738-9503-4e5e-8359-c1d73e1fc168/tuned/0.log"