Apr 16 14:52:10.760419 ip-10-0-129-105 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:11.233928 ip-10-0-129-105 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:11.233928 ip-10-0-129-105 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:11.233928 ip-10-0-129-105 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:11.233928 ip-10-0-129-105 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:11.233928 ip-10-0-129-105 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
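The deprecation warnings above all point at the kubelet config file named by --config. A minimal sketch of how those flags map to KubeletConfiguration fields (field names are from the upstream v1beta1 API; the values here are illustrative placeholders, not read from this node):

```yaml
# Sketch: config-file equivalents of the deprecated kubelet flags.
# Values are hypothetical examples, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"  # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:              # --minimum-container-ttl-duration's suggested replacement
  memory.available: "100Mi"
evictionSoft:
  memory.available: "200Mi"
evictionSoftGracePeriod:
  memory.available: "1m30s"
```

--pod-infra-container-image has no config-file equivalent to migrate here; per the warning, the sandbox image is instead configured in the remote runtime (e.g. CRI-O's pause_image).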
Apr 16 14:52:11.236334 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.236236 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:11.244467 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244441 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:11.244467 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244467 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:11.244467 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244470 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244473 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244477 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244480 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244483 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244486 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244489 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244492 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244494 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244497 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244500 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244503 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244506 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244509 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244511 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244514 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244517 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244520 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244522 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244525 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:11.244572 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244528 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244531 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244535 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244538 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244541 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244544 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244547 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244550 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244552 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244555 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244558 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244560 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244563 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244566 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244569 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244572 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244575 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244577 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244581 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244584 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:11.245045 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244587 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244589 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244592 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244597 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244602 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244606 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244609 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244611 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244615 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244617 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244620 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244623 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244626 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244628 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244631 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244634 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244636 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244639 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244641 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:11.245546 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244644 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244646 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244649 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244652 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244654 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244657 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244661 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244668 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244671 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244674 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244677 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244680 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244683 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244688 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244691 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244694 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244697 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244700 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244703 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244705 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:11.246006 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244708 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244711 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244714 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244718 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.244721 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245189 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245195 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245199 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245202 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245205 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245209 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245211 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245214 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245217 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245220 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245223 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245225 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245228 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245231 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245234 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:11.246506 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245237 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245240 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245243 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245245 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245248 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245251 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245254 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245256 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245259 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245262 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245264 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245267 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245269 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245272 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245276 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245278 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245281 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245283 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245286 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:11.246980 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245288 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245291 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245294 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245296 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245299 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245301 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245304 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245306 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245309 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245312 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245314 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245317 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245321 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245324 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245326 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245329 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245334 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245337 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245340 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:11.247475 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245343 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245346 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245349 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245352 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245354 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245358 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245360 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245363 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245365 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245368 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245371 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245374 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245377 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245380 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245382 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245385 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245387 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245390 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245393 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245395 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245398 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:11.247949 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245401 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245403 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245406 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245409 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245413 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245416 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245418 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245421 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245423 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245426 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245430 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.245433 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247202 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247214 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247222 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247227 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247232 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247235 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247240 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247245 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247248 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:11.248490 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247252 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247256 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247260 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247264 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247268 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247271 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247274 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247277 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247280 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247283 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247288 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247291 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247294 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247297 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247300 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247305 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247309 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247312 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247316 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247334 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247338 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247342 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247345 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247348 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247353 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:11.249009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247356 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247359 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247362 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247366 2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247369 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247374 2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247378 2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247381 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247385 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247388 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247392 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247396 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247399 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247402 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247405 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247408 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247412 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247415 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247418 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247421 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247424 2579 flags.go:64] FLAG: --feature-gates=""
Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247428 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 14:52:11.249672 ip-10-0-129-105
kubenswrapper[2579]: I0416 14:52:11.247432 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247436 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247440 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247443 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:11.249672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247446 2579 flags.go:64] FLAG: --help="false" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247450 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-129-105.ec2.internal" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247453 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247456 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247459 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247463 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247466 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247469 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247472 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247476 2579 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247479 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247482 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247485 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247489 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247492 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247495 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247498 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247502 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247505 2579 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247508 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247511 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247514 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247520 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:11.250335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247523 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:11.250891 ip-10-0-129-105 
kubenswrapper[2579]: I0416 14:52:11.247526 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247529 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247532 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247535 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247539 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247542 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247547 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247550 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247555 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247558 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247561 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247564 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247567 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247570 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247573 2579 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247576 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247585 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247588 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247591 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247595 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247598 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247603 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247606 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:11.250891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247609 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247613 2579 flags.go:64] FLAG: --port="10250" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247616 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247619 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e014752cbb9b86b6" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247623 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247626 
2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247629 2579 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247632 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247635 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247639 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247642 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247645 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247648 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247652 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247655 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247658 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247661 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247664 2579 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247667 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247671 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 
14:52:11.247674 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247677 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247680 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247683 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247686 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247690 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:11.251514 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247693 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247695 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247698 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247701 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247704 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247707 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247710 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247716 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247719 2579 flags.go:64] FLAG: 
--tls-cert-file="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247728 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247732 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247735 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247738 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247741 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247745 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247748 2579 flags.go:64] FLAG: --v="2" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247752 2579 flags.go:64] FLAG: --version="false" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247756 2579 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247761 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.247764 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248285 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248292 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248295 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:11.252142 ip-10-0-129-105 
kubenswrapper[2579]: W0416 14:52:11.248298 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:11.252142 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248301 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248304 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248307 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248310 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248314 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248316 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248319 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248323 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248327 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248330 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248333 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248336 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248339 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248343 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248346 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248348 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248351 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248355 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248358 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:11.252728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248361 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248364 2579 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248367 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248370 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248372 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248375 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248378 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248381 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248383 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248386 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248388 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248391 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248393 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248396 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:11.253240 
ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248398 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248401 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248403 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248407 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248409 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:11.253240 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248412 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248414 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248417 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248419 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248422 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248425 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248430 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248433 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248435 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248438 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248440 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248443 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248446 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248449 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248452 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248455 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248457 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248460 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248463 2579 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:11.253708 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248465 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248468 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248470 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248473 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248476 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248478 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248481 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248484 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248487 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248489 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248492 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248495 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248497 2579 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248500 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248503 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248505 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248508 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248511 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248513 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248516 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:11.254227 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248518 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:11.255147 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248521 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:11.255147 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248523 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:11.255147 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248526 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:11.255147 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.248528 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:11.255147 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.249079 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:11.256722 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.256696 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 14:52:11.256722 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.256724 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256806 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256816 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256821 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256826 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256831 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256835 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256839 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256844 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256848 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256853 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256857 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256862 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256866 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256870 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256876 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256881 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:11.256879 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256886 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256891 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256895 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256900 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256904 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256908 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256912 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256917 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256921 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256926 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256930 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256934 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256939 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256943 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256947 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256951 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256956 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256960 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256964 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:11.257645 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256969 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256973 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256977 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256981 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256985 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256989 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256993 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.256998 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257002 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257009 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257016 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257021 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257027 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257032 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257036 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257042 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257046 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257051 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257056 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:11.258298 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257060 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257064 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257086 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257090 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257094 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257098 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257102 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257106 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257110 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257114 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257119 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257123 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257127 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257132 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257136 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257141 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257146 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257150 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257155 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257160 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:11.258777 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257164 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257169 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257174 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257178 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257182 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257187 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257194 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257201 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257207 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257211 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257216 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257220 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.257229 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257394 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257402 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:11.259382 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257407 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257412 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257417 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257421 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257425 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257429 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257434 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257438 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257442 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257446 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257451 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257455 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257459 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257463 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257470 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257476 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257481 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257487 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257492 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257497 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:11.260101 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257501 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257505 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257509 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257514 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257519 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257523 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257528 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257532 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257536 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257540 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257544 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257550 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257555 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257559 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257564 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257568 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257572 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257577 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257581 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:11.260959 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257585 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257589 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257594 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257598 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257602 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257606 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257610 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257615 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257620 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257624 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257629 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257633 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257638 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257642 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257647 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257650 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257655 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257659 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257669 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257674 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:11.261728 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257679 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257682 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257687 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257691 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257695 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257699 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257704 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257708 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257712 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257717 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257721 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257725 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257730 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257734 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257738 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257742 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257747 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257751 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257755 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257760 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:11.262392 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257763 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257768 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257774 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257779 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:11.257783 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.257792 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.258799 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:11.262892 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.261825 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:11.263148 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.262929 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:11.263148 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.263037 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:11.263148 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.263092 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:11.293929 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.293898 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:11.296934 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.296903 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:11.312153 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.312126 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:11.317184 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.317160 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:11.318555 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.318537 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:11.322899 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.322870 2579 fs.go:135] Filesystem UUIDs: map[55e3ff63-2e9f-480b-b2c4-a9fbb69509d4:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f420e22a-4d95-4912-9b44-9bb40f09b2a2:/dev/nvme0n1p4]
Apr 16 14:52:11.323005 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.322895 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:11.324178 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.324156 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:11.329778 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.329622 2579 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:11.327192583 +0000 UTC m=+0.434821763 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097912 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b8520d9c66e8564bf15421fc9b463 SystemUUID:ec2b8520-d9c6-6e85-64bf-15421fc9b463 BootID:0d308664-643a-42e4-aeb6-22cd77c2ad38 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ec:a2:f5:6c:f9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ec:a2:f5:6c:f9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:d6:94:8c:f0:7e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:11.329778 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.329756 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:11.329993 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.329972 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:11.331195 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.331157 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:11.331392 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.331196 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-105.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:52:11.331491 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.331413 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:52:11.331491 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.331428 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:52:11.331491 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.331446 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:11.332304 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.332279 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:11.333100 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.333087 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:11.333245 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.333233 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:52:11.335544 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.335530 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:52:11.335612 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.335550 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:52:11.335612 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.335567 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:52:11.335612 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.335581 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:52:11.335612 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.335595 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:52:11.337730 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.337359 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:11.337730 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.337459 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:11.343545 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.343524 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:52:11.345253 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.345232 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7hfdb"
Apr 16 14:52:11.345322 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.345296 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 14:52:11.346673 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346658 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346678 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346685 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346691 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346700 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346708 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346715 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346720 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 14:52:11.346725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346728 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 14:52:11.346954 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346735 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 14:52:11.346954 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346746 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 14:52:11.346954 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.346755 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 14:52:11.347364 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.347340 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-105.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 14:52:11.347437 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.347398 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 14:52:11.348708 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.348691 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 14:52:11.348708 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.348707 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 14:52:11.352566 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.352550 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 14:52:11.352655 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.352604 2579 server.go:1295] "Started kubelet"
Apr 16 14:52:11.352758 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.352721 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 14:52:11.352848 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.352806 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 14:52:11.352888 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.352871 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 14:52:11.353603 ip-10-0-129-105 systemd[1]: Started Kubernetes Kubelet.
Apr 16 14:52:11.353864 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.353845 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 14:52:11.354564 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.354482 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-105.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 14:52:11.355416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.355404 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 14:52:11.356067 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.356049 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7hfdb"
Apr 16 14:52:11.361175 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.360154 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-105.ec2.internal.18a6ddf144a20ad2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-105.ec2.internal,UID:ip-10-0-129-105.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-105.ec2.internal,},FirstTimestamp:2026-04-16 14:52:11.352566482 +0000 UTC m=+0.460195600,LastTimestamp:2026-04-16 14:52:11.352566482 +0000 UTC m=+0.460195600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-105.ec2.internal,}"
Apr 16 14:52:11.361756 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.361726 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 14:52:11.362417 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.362401 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:11.362841 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.362824 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 14:52:11.363686 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.363664 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 14:52:11.363915 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.363687 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 14:52:11.363973 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.363693 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.363973 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.363664 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 14:52:11.363973 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.363925 2579 factory.go:55] Registering systemd factory
Apr 16 14:52:11.364116 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.363989 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 16 14:52:11.364116 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364053 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 14:52:11.364116 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364104 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 14:52:11.364313 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364298 2579 factory.go:153] Registering CRI-O factory
Apr 16 14:52:11.364381 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364316 2579 factory.go:223] Registration of the crio container factory successfully
Apr 16 14:52:11.364463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364452 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 14:52:11.364516 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364481 2579 factory.go:103] Registering Raw factory
Apr 16 14:52:11.364516 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.364516 2579 manager.go:1196] Started watching for new ooms in manager
Apr 16 14:52:11.365207 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.365189 2579 manager.go:319] Starting recovery of all containers
Apr 16 14:52:11.367128 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.367106 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:11.369950 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.369922 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-105.ec2.internal\" not found" node="ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.378680 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.378655 2579 manager.go:324] Recovery completed
Apr 16 14:52:11.383394 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.383377 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:11.386177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.386156 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:11.386267 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.386192 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:11.386267 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.386203 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:11.386707 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.386686 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 14:52:11.386707 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.386696 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 14:52:11.386805 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.386718 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:11.388891 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.388877 2579 policy_none.go:49] "None policy: Start"
Apr 16 14:52:11.388933 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.388897 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 14:52:11.388933 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.388911 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 14:52:11.427905 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.427881 2579 manager.go:341] "Starting Device Plugin manager"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.428066 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.428108 2579 server.go:85] "Starting device plugin registration server"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.428389 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.428401 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.428538 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.428626 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.428641 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.429299 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 14:52:11.438412 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.429335 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.465899 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.465858 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 14:52:11.467134 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.467110 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 14:52:11.467134 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.467135 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 14:52:11.467302 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.467155 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 14:52:11.467302 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.467164 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 14:52:11.467302 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.467200 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 14:52:11.468983 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.468961 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:11.529499 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.529404 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:11.530430 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.530413 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:11.530495 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.530445 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:11.530495 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.530454 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:11.530495 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.530486 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.537732 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.537712 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.537798 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.537741 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-105.ec2.internal\": node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.554661 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.554630 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.568240 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.568204 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"]
Apr 16 14:52:11.568312 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.568291 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:11.569234 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.569220 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:11.569286 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.569249 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:11.569286 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.569263 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:11.570333 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.570321 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:11.570492 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.570477 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.570547 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.570509 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:11.571023 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.571007 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:11.571113 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.571034 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:11.571113 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.571048 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:11.571113 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.571058 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:11.571113 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.571099 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:11.571113 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.571112 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:11.573422 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.573409 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.573472 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.573433 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:11.574225 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.574212 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:11.574283 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.574232 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:11.574283 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.574244 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:11.600001 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.599977 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-105.ec2.internal\" not found" node="ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.604716 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.604698 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-105.ec2.internal\" not found" node="ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.654833 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.654805 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.665177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.665149 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf0dcc2c635c65b3b15206dc72520006-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal\" (UID: \"cf0dcc2c635c65b3b15206dc72520006\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.665238 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.665182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7b6d0dbc6e31e2e27484e63c6997a81-config\") pod \"kube-apiserver-proxy-ip-10-0-129-105.ec2.internal\" (UID: \"c7b6d0dbc6e31e2e27484e63c6997a81\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.665238 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.665210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cf0dcc2c635c65b3b15206dc72520006-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal\" (UID: \"cf0dcc2c635c65b3b15206dc72520006\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.755088 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.755038 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.765412 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.765389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cf0dcc2c635c65b3b15206dc72520006-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal\" (UID: \"cf0dcc2c635c65b3b15206dc72520006\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.765469 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.765419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf0dcc2c635c65b3b15206dc72520006-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal\" (UID: \"cf0dcc2c635c65b3b15206dc72520006\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.765469 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.765437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7b6d0dbc6e31e2e27484e63c6997a81-config\") pod \"kube-apiserver-proxy-ip-10-0-129-105.ec2.internal\" (UID: \"c7b6d0dbc6e31e2e27484e63c6997a81\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.765529 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.765494 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c7b6d0dbc6e31e2e27484e63c6997a81-config\") pod \"kube-apiserver-proxy-ip-10-0-129-105.ec2.internal\" (UID: \"c7b6d0dbc6e31e2e27484e63c6997a81\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.765561 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.765515 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cf0dcc2c635c65b3b15206dc72520006-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal\" (UID: \"cf0dcc2c635c65b3b15206dc72520006\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.765561 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.765515 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cf0dcc2c635c65b3b15206dc72520006-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal\" (UID: \"cf0dcc2c635c65b3b15206dc72520006\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.855928 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.855844 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:11.903382 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.903352 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.906986 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:11.906960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:11.956770 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:11.956719 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.057376 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:12.057330 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.157979 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:12.157905 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.258419 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:12.258384 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.262565 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.262540 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 14:52:12.262708 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.262692 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:12.262748 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.262730 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 14:52:12.359372 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:12.359336 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.359372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.359353 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:11 +0000 UTC" deadline="2027-12-15 12:23:27.795006029 +0000 UTC"
Apr 16 14:52:12.359372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.359379 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14589h31m15.435630087s"
Apr 16 14:52:12.363327 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.363306 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:12.372632 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:12.372594 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b6d0dbc6e31e2e27484e63c6997a81.slice/crio-f10402ceb089fbd1ea60eeea6930a11a7d85fd4464f125794b48cc1a4a2b428e WatchSource:0}: Error finding container f10402ceb089fbd1ea60eeea6930a11a7d85fd4464f125794b48cc1a4a2b428e: Status 404 returned error can't find the container with id f10402ceb089fbd1ea60eeea6930a11a7d85fd4464f125794b48cc1a4a2b428e
Apr 16 14:52:12.373387 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:12.373367 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0dcc2c635c65b3b15206dc72520006.slice/crio-1b45a0b8835c2d34ab409d8cc2d4c363ddaa6e38f70fc3cb4748ef9b0223f191 WatchSource:0}: Error finding container 1b45a0b8835c2d34ab409d8cc2d4c363ddaa6e38f70fc3cb4748ef9b0223f191: Status 404 returned error can't find the container with id 1b45a0b8835c2d34ab409d8cc2d4c363ddaa6e38f70fc3cb4748ef9b0223f191
Apr 16 14:52:12.374882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.374861 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:12.377120 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.377108 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:52:12.394199 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.394169 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fdrkh"
Apr 16 14:52:12.403136 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.403106 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fdrkh"
Apr 16 14:52:12.460245 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:12.460156 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.470535 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.470477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal" event={"ID":"c7b6d0dbc6e31e2e27484e63c6997a81","Type":"ContainerStarted","Data":"f10402ceb089fbd1ea60eeea6930a11a7d85fd4464f125794b48cc1a4a2b428e"}
Apr 16 14:52:12.471334 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.471313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal" event={"ID":"cf0dcc2c635c65b3b15206dc72520006","Type":"ContainerStarted","Data":"1b45a0b8835c2d34ab409d8cc2d4c363ddaa6e38f70fc3cb4748ef9b0223f191"}
Apr 16 14:52:12.560579 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:12.560544 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-105.ec2.internal\" not found"
Apr 16 14:52:12.622977 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.622947 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:12.663131 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.663111 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:12.673128 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.673105 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:12.675182 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.675168 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal"
Apr 16 14:52:12.681964 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.681941 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 14:52:12.880131 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:12.879748 2579 reflector.go:430]
"Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:13.302004 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.301922 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:13.337568 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.337530 2579 apiserver.go:52] "Watching apiserver" Apr 16 14:52:13.342671 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.342644 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:13.343035 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.343005 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-6kn6f","kube-system/konnectivity-agent-nxthn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6","openshift-image-registry/node-ca-4d625","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal","openshift-network-operator/iptables-alerter-cl4w9","openshift-ovn-kubernetes/ovnkube-node-m882k","kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal","openshift-cluster-node-tuning-operator/tuned-jpzc4","openshift-dns/node-resolver-5qg4q","openshift-multus/multus-47k57","openshift-multus/multus-additional-cni-plugins-hkq7w","openshift-multus/network-metrics-daemon-8g7qk"] Apr 16 14:52:13.346273 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.346232 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.348163 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348124 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:13.348163 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348159 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:13.348350 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348195 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rkf9w\"" Apr 16 14:52:13.348554 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348533 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nxthn" Apr 16 14:52:13.348651 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348617 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.348706 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348665 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:13.348835 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.348806 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.349018 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.349002 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:13.350396 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.350378 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:13.350880 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.350859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.351064 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.351041 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-26c42\"" Apr 16 14:52:13.351372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.351272 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:13.352567 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.352549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.352761 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.352746 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.352885 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.352770 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:52:13.353024 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.353010 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jzmn\"" Apr 16 14:52:13.353191 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.353166 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4d625" Apr 16 14:52:13.354768 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.354736 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xftsh\"" Apr 16 14:52:13.354865 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.354823 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.354865 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.354829 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:13.354865 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.354849 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.355416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.355396 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-cl4w9" Apr 16 14:52:13.358355 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.356971 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:13.358355 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.357298 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.358355 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.357285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.358355 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.357630 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fj4s8\"" Apr 16 14:52:13.360973 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.360951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:13.361098 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.361041 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:13.361098 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.361052 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.362866 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.362847 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.362866 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.362863 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.363018 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.362847 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pnhl7\"" Apr 16 14:52:13.363322 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.363299 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5qg4q" Apr 16 14:52:13.365909 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.365659 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-47k57" Apr 16 14:52:13.365909 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.365695 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b5zzr\"" Apr 16 14:52:13.365909 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.365705 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.365909 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.365829 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.367933 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.367913 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:13.368031 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.367980 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:13.368224 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.368196 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:13.368315 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.368276 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb7vq\"" Apr 16 14:52:13.368394 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.368379 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" Apr 16 14:52:13.368489 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.368474 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:13.370935 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.370786 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:13.371177 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.370969 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:13.371177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.371009 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:13.371177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.371031 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:13.371350 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.371330 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6bv8c\"" Apr 16 14:52:13.374235 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca6006fe-c049-4f20-b847-d14270e6af58-multus-daemon-config\") pod \"multus-47k57\" (UID: 
\"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.374345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374249 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysctl-conf\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.374345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-slash\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.374345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-cnibin\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.374345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-cni-multus\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.374345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-modprobe-d\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-sys\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-var-lib-kubelet\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-var-lib-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-node-log\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374484 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-conf-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-log-socket\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374559 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-cni-bin\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374579 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-hostroot\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 
14:52:13.374613 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374600 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-etc-kubernetes\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374620 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-run\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-ovn\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-socket-dir-parent\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374742 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c55b2\" (UniqueName: \"kubernetes.io/projected/ca6006fe-c049-4f20-b847-d14270e6af58-kube-api-access-c55b2\") pod \"multus-47k57\" (UID: 
\"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6ae390c-ede3-458f-8330-0d8d3aad76c2-hosts-file\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-kubelet\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-etc-selinux\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-lib-modules\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6ae390c-ede3-458f-8330-0d8d3aad76c2-tmp-dir\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.374990 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovnkube-script-lib\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-socket-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-system-cni-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.375107 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375085 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca6006fe-c049-4f20-b847-d14270e6af58-cni-binary-copy\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-kubelet\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-cni-netd\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovnkube-config\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-multus-certs\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375295 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-env-overrides\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e9ac884-30e2-4486-9e89-d541e73ee8c4-iptables-alerter-script\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b4700c8-468e-4f4d-9f39-760f7db3a824-agent-certs\") pod \"konnectivity-agent-nxthn\" (UID: \"2b4700c8-468e-4f4d-9f39-760f7db3a824\") " pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysconfig\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375416 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-kubernetes\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375441 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2ck\" (UniqueName: \"kubernetes.io/projected/4c666548-852c-40b9-aa2a-ee177bbfb811-kube-api-access-8q2ck\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375478 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-systemd-units\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375550 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmphs\" (UniqueName: \"kubernetes.io/projected/539ba0b2-e94b-4e6d-9955-d2325acb7a00-kube-api-access-cmphs\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375582 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c778a259-410c-444b-a486-c230dd795def-host\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-cni-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e9ac884-30e2-4486-9e89-d541e73ee8c4-host-slash\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9"
Apr 16 14:52:13.375741 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-k8s-cni-cncf-io\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysctl-d\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovn-node-metrics-cert\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375790 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-sys-fs\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c778a259-410c-444b-a486-c230dd795def-serviceca\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375828 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-run-netns\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375869 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-systemd\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375890 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-device-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b4700c8-468e-4f4d-9f39-760f7db3a824-konnectivity-ca\") pod \"konnectivity-agent-nxthn\" (UID: \"2b4700c8-468e-4f4d-9f39-760f7db3a824\") " pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-etc-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375949 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhgr6\" (UniqueName: \"kubernetes.io/projected/493eb5f3-c1cb-4508-8b86-fb60aa459acc-kube-api-access-mhgr6\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.375989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9db4\" (UniqueName: \"kubernetes.io/projected/c778a259-410c-444b-a486-c230dd795def-kube-api-access-x9db4\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c666548-852c-40b9-aa2a-ee177bbfb811-tmp\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376037 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx796\" (UniqueName: \"kubernetes.io/projected/f6ae390c-ede3-458f-8330-0d8d3aad76c2-kube-api-access-nx796\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376058 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-cni-bin\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376109 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-host\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376157 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-netns\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-registration-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzk65\" (UniqueName: \"kubernetes.io/projected/9e9ac884-30e2-4486-9e89-d541e73ee8c4-kube-api-access-xzk65\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-systemd\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376282 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-tuned\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-run-ovn-kubernetes\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.376997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.376344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-os-release\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.403963 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.403921 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:12 +0000 UTC" deadline="2028-01-31 16:42:42.192791816 +0000 UTC"
Apr 16 14:52:13.403963 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.403963 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15721h50m28.788833684s"
Apr 16 14:52:13.464829 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.464799 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 14:52:13.476994 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.476962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-host\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.476994 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.476995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-netns\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-registration-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzk65\" (UniqueName: \"kubernetes.io/projected/9e9ac884-30e2-4486-9e89-d541e73ee8c4-kube-api-access-xzk65\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-host\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-netns\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-systemd\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-tuned\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477226 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-run-ovn-kubernetes\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.477231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-registration-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477248 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-os-release\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-systemd\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca6006fe-c049-4f20-b847-d14270e6af58-multus-daemon-config\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477277 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-run-ovn-kubernetes\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysctl-conf\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-slash\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-cnibin\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477322 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-os-release\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-cni-multus\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-slash\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-cni-multus\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysctl-conf\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-cnibin\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-cni-binary-copy\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-modprobe-d\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477557 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-sys\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-var-lib-kubelet\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.477670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-var-lib-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477604 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-node-log\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477651 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-node-log\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-modprobe-d\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-sys\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-conf-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-var-lib-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-conf-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477757 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ca6006fe-c049-4f20-b847-d14270e6af58-multus-daemon-config\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-var-lib-kubelet\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-log-socket\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477863 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-log-socket\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-cni-bin\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477942 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-hostroot\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.478402 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-etc-kubernetes\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.477995 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-cni-bin\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478001 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-os-release\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478021 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-hostroot\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-run\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-etc-kubernetes\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-ovn\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-run\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-socket-dir-parent\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c55b2\" (UniqueName: \"kubernetes.io/projected/ca6006fe-c049-4f20-b847-d14270e6af58-kube-api-access-c55b2\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6ae390c-ede3-458f-8330-0d8d3aad76c2-hosts-file\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-ovn\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-kubelet\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.479249 ip-10-0-129-105
kubenswrapper[2579]: I0416 14:52:13.478212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-socket-dir-parent\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-kubelet\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6ae390c-ede3-458f-8330-0d8d3aad76c2-hosts-file\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q" Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-etc-selinux\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 
14:52:13.479249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478371 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-etc-selinux\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-lib-modules\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6ae390c-ede3-458f-8330-0d8d3aad76c2-tmp-dir\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovnkube-script-lib\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-lib-modules\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 
14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-socket-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478594 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-system-cni-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478629 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca6006fe-c049-4f20-b847-d14270e6af58-cni-binary-copy\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478651 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-kubelet\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-socket-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 
14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478680 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478685 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-system-cni-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-var-lib-kubelet\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-cni-netd\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478750 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6ae390c-ede3-458f-8330-0d8d3aad76c2-tmp-dir\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q" Apr 16 
14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovnkube-config\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478817 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-cni-netd\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-multus-certs\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480041 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-env-overrides\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e9ac884-30e2-4486-9e89-d541e73ee8c4-iptables-alerter-script\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " 
pod="openshift-network-operator/iptables-alerter-cl4w9" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.478907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b4700c8-468e-4f4d-9f39-760f7db3a824-agent-certs\") pod \"konnectivity-agent-nxthn\" (UID: \"2b4700c8-468e-4f4d-9f39-760f7db3a824\") " pod="kube-system/konnectivity-agent-nxthn" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479197 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovnkube-script-lib\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca6006fe-c049-4f20-b847-d14270e6af58-cni-binary-copy\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-multus-certs\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysconfig\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-kubernetes\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovnkube-config\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2ck\" (UniqueName: \"kubernetes.io/projected/4c666548-852c-40b9-aa2a-ee177bbfb811-kube-api-access-8q2ck\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-systemd-units\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmphs\" (UniqueName: \"kubernetes.io/projected/539ba0b2-e94b-4e6d-9955-d2325acb7a00-kube-api-access-cmphs\") pod \"ovnkube-node-m882k\" (UID: 
\"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479398 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-kubernetes\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c778a259-410c-444b-a486-c230dd795def-host\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysconfig\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-cni-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/539ba0b2-e94b-4e6d-9955-d2325acb7a00-env-overrides\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e9ac884-30e2-4486-9e89-d541e73ee8c4-iptables-alerter-script\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9" Apr 16 14:52:13.480882 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479496 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-multus-cni-dir\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c778a259-410c-444b-a486-c230dd795def-host\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-systemd-units\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479598 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-system-cni-dir\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " 
pod="openshift-multus/multus-additional-cni-plugins-hkq7w" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479636 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-cnibin\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479666 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e9ac884-30e2-4486-9e89-d541e73ee8c4-host-slash\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-k8s-cni-cncf-io\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysctl-d\") 
pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479759 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e9ac884-30e2-4486-9e89-d541e73ee8c4-host-slash\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479773 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ca6006fe-c049-4f20-b847-d14270e6af58-host-run-k8s-cni-cncf-io\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovn-node-metrics-cert\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479811 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-sys-fs\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/c778a259-410c-444b-a486-c230dd795def-serviceca\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479871 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-sysctl-d\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-run-netns\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479914 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-run-netns\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.481719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-systemd\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-systemd\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k" Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-device-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.479988 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w" Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b4700c8-468e-4f4d-9f39-760f7db3a824-konnectivity-ca\") pod \"konnectivity-agent-nxthn\" (UID: \"2b4700c8-468e-4f4d-9f39-760f7db3a824\") " pod="kube-system/konnectivity-agent-nxthn" Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-device-dir\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480106 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-etc-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhgr6\" (UniqueName: \"kubernetes.io/projected/493eb5f3-c1cb-4508-8b86-fb60aa459acc-kube-api-access-mhgr6\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9db4\" (UniqueName: \"kubernetes.io/projected/c778a259-410c-444b-a486-c230dd795def-kube-api-access-x9db4\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsshl\" (UniqueName: \"kubernetes.io/projected/cc0a6a72-089b-44bd-97ca-a4963264f458-kube-api-access-jsshl\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c666548-852c-40b9-aa2a-ee177bbfb811-tmp\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nx796\" (UniqueName: \"kubernetes.io/projected/f6ae390c-ede3-458f-8330-0d8d3aad76c2-kube-api-access-nx796\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-cni-bin\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c778a259-410c-444b-a486-c230dd795def-serviceca\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-host-cni-bin\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480386 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-etc-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.482548 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv8b9\" (UniqueName: \"kubernetes.io/projected/a3db0253-f985-4d95-b46c-abb2acc3e872-kube-api-access-hv8b9\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/539ba0b2-e94b-4e6d-9955-d2325acb7a00-run-openvswitch\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/493eb5f3-c1cb-4508-8b86-fb60aa459acc-sys-fs\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2b4700c8-468e-4f4d-9f39-760f7db3a824-konnectivity-ca\") pod \"konnectivity-agent-nxthn\" (UID: \"2b4700c8-468e-4f4d-9f39-760f7db3a824\") " pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.480999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4c666548-852c-40b9-aa2a-ee177bbfb811-etc-tuned\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.482553 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2b4700c8-468e-4f4d-9f39-760f7db3a824-agent-certs\") pod \"konnectivity-agent-nxthn\" (UID: \"2b4700c8-468e-4f4d-9f39-760f7db3a824\") " pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.482866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c666548-852c-40b9-aa2a-ee177bbfb811-tmp\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.483223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.482925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/539ba0b2-e94b-4e6d-9955-d2325acb7a00-ovn-node-metrics-cert\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.485667 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.485638 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:13.485667 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.485663 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:13.485836 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.485676 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cmdjx for pod openshift-network-diagnostics/network-check-target-6kn6f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:13.485836 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.485765 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx podName:274057c1-8751-4b12-8464-7a42a2c6372c nodeName:}" failed. No retries permitted until 2026-04-16 14:52:13.9857197 +0000 UTC m=+3.093348802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cmdjx" (UniqueName: "kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx") pod "network-check-target-6kn6f" (UID: "274057c1-8751-4b12-8464-7a42a2c6372c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:13.490765 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.487500 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzk65\" (UniqueName: \"kubernetes.io/projected/9e9ac884-30e2-4486-9e89-d541e73ee8c4-kube-api-access-xzk65\") pod \"iptables-alerter-cl4w9\" (UID: \"9e9ac884-30e2-4486-9e89-d541e73ee8c4\") " pod="openshift-network-operator/iptables-alerter-cl4w9"
Apr 16 14:52:13.490765 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.489057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c55b2\" (UniqueName: \"kubernetes.io/projected/ca6006fe-c049-4f20-b847-d14270e6af58-kube-api-access-c55b2\") pod \"multus-47k57\" (UID: \"ca6006fe-c049-4f20-b847-d14270e6af58\") " pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.490765 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.489258 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhgr6\" (UniqueName: \"kubernetes.io/projected/493eb5f3-c1cb-4508-8b86-fb60aa459acc-kube-api-access-mhgr6\") pod \"aws-ebs-csi-driver-node-pwmd6\" (UID: \"493eb5f3-c1cb-4508-8b86-fb60aa459acc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.491747 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.491703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx796\" (UniqueName: \"kubernetes.io/projected/f6ae390c-ede3-458f-8330-0d8d3aad76c2-kube-api-access-nx796\") pod \"node-resolver-5qg4q\" (UID: \"f6ae390c-ede3-458f-8330-0d8d3aad76c2\") " pod="openshift-dns/node-resolver-5qg4q"
Apr 16 14:52:13.491837 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.491785 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmphs\" (UniqueName: \"kubernetes.io/projected/539ba0b2-e94b-4e6d-9955-d2325acb7a00-kube-api-access-cmphs\") pod \"ovnkube-node-m882k\" (UID: \"539ba0b2-e94b-4e6d-9955-d2325acb7a00\") " pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.491913 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.491875 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2ck\" (UniqueName: \"kubernetes.io/projected/4c666548-852c-40b9-aa2a-ee177bbfb811-kube-api-access-8q2ck\") pod \"tuned-jpzc4\" (UID: \"4c666548-852c-40b9-aa2a-ee177bbfb811\") " pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.493172 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.493149 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9db4\" (UniqueName: \"kubernetes.io/projected/c778a259-410c-444b-a486-c230dd795def-kube-api-access-x9db4\") pod \"node-ca-4d625\" (UID: \"c778a259-410c-444b-a486-c230dd795def\") " pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.581513 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-cni-binary-copy\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.581513 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581467 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.581726 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-os-release\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.581812 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.581812 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-os-release\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.581812 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-system-cni-dir\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.581812 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-cnibin\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-cnibin\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-system-cni-dir\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581917 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsshl\" (UniqueName: \"kubernetes.io/projected/cc0a6a72-089b-44bd-97ca-a4963264f458-kube-api-access-jsshl\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.581974 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv8b9\" (UniqueName: \"kubernetes.io/projected/a3db0253-f985-4d95-b46c-abb2acc3e872-kube-api-access-hv8b9\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:13.582013 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.582011 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:13.582328 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.582057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582328 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:13.582098 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:14.082065745 +0000 UTC m=+3.189694861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:13.582328 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.582161 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-cni-binary-copy\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582328 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.582260 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc0a6a72-089b-44bd-97ca-a4963264f458-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.582717 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.582697 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc0a6a72-089b-44bd-97ca-a4963264f458-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.590523 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.590499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv8b9\" (UniqueName: \"kubernetes.io/projected/a3db0253-f985-4d95-b46c-abb2acc3e872-kube-api-access-hv8b9\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:13.590667 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.590542 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsshl\" (UniqueName: \"kubernetes.io/projected/cc0a6a72-089b-44bd-97ca-a4963264f458-kube-api-access-jsshl\") pod \"multus-additional-cni-plugins-hkq7w\" (UID: \"cc0a6a72-089b-44bd-97ca-a4963264f458\") " pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.661500 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.661461 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:13.667308 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.667277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:13.676083 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.676030 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6"
Apr 16 14:52:13.681687 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.681662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4d625"
Apr 16 14:52:13.689309 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.689272 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cl4w9"
Apr 16 14:52:13.697056 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.697023 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jpzc4"
Apr 16 14:52:13.706796 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.706765 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5qg4q"
Apr 16 14:52:13.712527 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.712496 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47k57"
Apr 16 14:52:13.718213 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.718186 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hkq7w"
Apr 16 14:52:13.805256 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:13.805221 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:14.070167 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.070126 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e9ac884_30e2_4486_9e89_d541e73ee8c4.slice/crio-dad50aa8e4f56cedb5c41abc4c64ab18fb510cb6580d3f7e073cf7061c5097eb WatchSource:0}: Error finding container dad50aa8e4f56cedb5c41abc4c64ab18fb510cb6580d3f7e073cf7061c5097eb: Status 404 returned error can't find the container with id dad50aa8e4f56cedb5c41abc4c64ab18fb510cb6580d3f7e073cf7061c5097eb
Apr 16 14:52:14.071947 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.071925 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b4700c8_468e_4f4d_9f39_760f7db3a824.slice/crio-e90da5edd6dcdf0013700a0f673393ae226db4597983e7f885dda45f8faa2ec4 WatchSource:0}: Error finding container e90da5edd6dcdf0013700a0f673393ae226db4597983e7f885dda45f8faa2ec4: Status 404 returned error can't find the container with id e90da5edd6dcdf0013700a0f673393ae226db4597983e7f885dda45f8faa2ec4
Apr 16 14:52:14.073547 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.073525 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6006fe_c049_4f20_b847_d14270e6af58.slice/crio-6664c9d62e0ce9d5316a981c4d7f0d69f29a7dd94f089d05ad1fb214abe75745 WatchSource:0}: Error finding container 6664c9d62e0ce9d5316a981c4d7f0d69f29a7dd94f089d05ad1fb214abe75745: Status 404 returned error can't find the container with id 6664c9d62e0ce9d5316a981c4d7f0d69f29a7dd94f089d05ad1fb214abe75745
Apr 16 14:52:14.084937 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.084916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:14.085020 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.084952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:14.085058 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:14.085043 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:14.085058 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:14.085055 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:14.085133 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:14.085085 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:14.085133 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:14.085099 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cmdjx for pod openshift-network-diagnostics/network-check-target-6kn6f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:14.085133 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:14.085117 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:15.085095498 +0000 UTC m=+4.192724601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:14.085261 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:14.085132 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx podName:274057c1-8751-4b12-8464-7a42a2c6372c nodeName:}" failed. No retries permitted until 2026-04-16 14:52:15.085122323 +0000 UTC m=+4.192751431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmdjx" (UniqueName: "kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx") pod "network-check-target-6kn6f" (UID: "274057c1-8751-4b12-8464-7a42a2c6372c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:14.095743 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.095720 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc0a6a72_089b_44bd_97ca_a4963264f458.slice/crio-367b2fe64a28a04fc9c2b8587300b897f975ff12c9cac11b6eb3cab4911cf54e WatchSource:0}: Error finding container 367b2fe64a28a04fc9c2b8587300b897f975ff12c9cac11b6eb3cab4911cf54e: Status 404 returned error can't find the container with id 367b2fe64a28a04fc9c2b8587300b897f975ff12c9cac11b6eb3cab4911cf54e
Apr 16 14:52:14.096367 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.096308 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ae390c_ede3_458f_8330_0d8d3aad76c2.slice/crio-59c3eb50b47de071c5b188e30d0240937c6d850fad8932eb7413d73ede28419f WatchSource:0}: Error finding container 59c3eb50b47de071c5b188e30d0240937c6d850fad8932eb7413d73ede28419f: Status 404 returned error can't find the container with id 59c3eb50b47de071c5b188e30d0240937c6d850fad8932eb7413d73ede28419f
Apr 16 14:52:14.097971 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.097939 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c666548_852c_40b9_aa2a_ee177bbfb811.slice/crio-df8910edc96147a02202a15752a2048c55269400f9cecb347cfaaef008059910 WatchSource:0}: Error finding container df8910edc96147a02202a15752a2048c55269400f9cecb347cfaaef008059910: Status 404 returned error can't find the container with id df8910edc96147a02202a15752a2048c55269400f9cecb347cfaaef008059910
Apr 16 14:52:14.098758 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.098726 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc778a259_410c_444b_a486_c230dd795def.slice/crio-74a7e970595e1f9d738c17c1aed23e700740bc737597fbcf09bdca40ac3f313f WatchSource:0}: Error finding container 74a7e970595e1f9d738c17c1aed23e700740bc737597fbcf09bdca40ac3f313f: Status 404 returned error can't find the container with id 74a7e970595e1f9d738c17c1aed23e700740bc737597fbcf09bdca40ac3f313f
Apr 16 14:52:14.099801 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.099773 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod539ba0b2_e94b_4e6d_9955_d2325acb7a00.slice/crio-aa981d9802e4bc9cf7996b4bbcd8ff49725056a1227f6c604873f6e33679a90f WatchSource:0}: Error finding container aa981d9802e4bc9cf7996b4bbcd8ff49725056a1227f6c604873f6e33679a90f: Status 404 returned error can't find the container with id aa981d9802e4bc9cf7996b4bbcd8ff49725056a1227f6c604873f6e33679a90f
Apr 16 14:52:14.100727 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:14.100707 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod493eb5f3_c1cb_4508_8b86_fb60aa459acc.slice/crio-4d1b72ec52c6e2477b75067f17fc2fb8b3d249b064287d741ce12dcf751bf9cd WatchSource:0}: Error finding container 4d1b72ec52c6e2477b75067f17fc2fb8b3d249b064287d741ce12dcf751bf9cd: Status 404 returned error can't find the container with id 4d1b72ec52c6e2477b75067f17fc2fb8b3d249b064287d741ce12dcf751bf9cd
Apr 16 14:52:14.404672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.404545 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:12 +0000 UTC" deadline="2027-10-17 02:03:35.081107976 +0000 UTC"
Apr 16 14:52:14.404672 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.404586 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13163h11m20.676526234s"
Apr 16 14:52:14.477577 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.477538 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal" event={"ID":"c7b6d0dbc6e31e2e27484e63c6997a81","Type":"ContainerStarted","Data":"be5a9c674afd6b16f8affd70d79c161b5a545f1a1200a75f364f8514d9cdfe69"}
Apr 16 14:52:14.485849 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.485789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" event={"ID":"4c666548-852c-40b9-aa2a-ee177bbfb811","Type":"ContainerStarted","Data":"df8910edc96147a02202a15752a2048c55269400f9cecb347cfaaef008059910"}
Apr 16 14:52:14.493130 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.493056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5qg4q" event={"ID":"f6ae390c-ede3-458f-8330-0d8d3aad76c2","Type":"ContainerStarted","Data":"59c3eb50b47de071c5b188e30d0240937c6d850fad8932eb7413d73ede28419f"}
Apr 16 14:52:14.497847 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.497779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47k57" event={"ID":"ca6006fe-c049-4f20-b847-d14270e6af58","Type":"ContainerStarted","Data":"6664c9d62e0ce9d5316a981c4d7f0d69f29a7dd94f089d05ad1fb214abe75745"}
Apr 16 14:52:14.500907 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.500851 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" event={"ID":"493eb5f3-c1cb-4508-8b86-fb60aa459acc","Type":"ContainerStarted","Data":"4d1b72ec52c6e2477b75067f17fc2fb8b3d249b064287d741ce12dcf751bf9cd"}
Apr 16 14:52:14.503725 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.503691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"aa981d9802e4bc9cf7996b4bbcd8ff49725056a1227f6c604873f6e33679a90f"}
Apr 16 14:52:14.509065 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.509034 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4d625" event={"ID":"c778a259-410c-444b-a486-c230dd795def","Type":"ContainerStarted","Data":"74a7e970595e1f9d738c17c1aed23e700740bc737597fbcf09bdca40ac3f313f"}
Apr 16 14:52:14.511147 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.511114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerStarted","Data":"367b2fe64a28a04fc9c2b8587300b897f975ff12c9cac11b6eb3cab4911cf54e"}
Apr 16 14:52:14.513504 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.513465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nxthn" event={"ID":"2b4700c8-468e-4f4d-9f39-760f7db3a824","Type":"ContainerStarted","Data":"e90da5edd6dcdf0013700a0f673393ae226db4597983e7f885dda45f8faa2ec4"}
Apr 16 14:52:14.516857 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:14.516833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cl4w9" event={"ID":"9e9ac884-30e2-4486-9e89-d541e73ee8c4","Type":"ContainerStarted","Data":"dad50aa8e4f56cedb5c41abc4c64ab18fb510cb6580d3f7e073cf7061c5097eb"}
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.092546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.092621 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.092757 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.092777 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.092790 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cmdjx for pod openshift-network-diagnostics/network-check-target-6kn6f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.092848 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx podName:274057c1-8751-4b12-8464-7a42a2c6372c nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.092828255 +0000 UTC m=+6.200457378 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmdjx" (UniqueName: "kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx") pod "network-check-target-6kn6f" (UID: "274057c1-8751-4b12-8464-7a42a2c6372c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.093268 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:15.093371 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.093317 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:17.093303744 +0000 UTC m=+6.200932853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:15.480608 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.480577 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:15.481182 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.480713 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:15.481182 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.481099 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:15.481293 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:15.481194 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:15.531032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.530999 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal" event={"ID":"cf0dcc2c635c65b3b15206dc72520006","Type":"ContainerDied","Data":"9bd1dd9004aa3ee1639bed261dd1e18d8d324eb8dab78c5e366e226918330fc0"} Apr 16 14:52:15.531229 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.530967 2579 generic.go:358] "Generic (PLEG): container finished" podID="cf0dcc2c635c65b3b15206dc72520006" containerID="9bd1dd9004aa3ee1639bed261dd1e18d8d324eb8dab78c5e366e226918330fc0" exitCode=0 Apr 16 14:52:15.547964 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:15.546613 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-105.ec2.internal" podStartSLOduration=3.546597455 podStartE2EDuration="3.546597455s" podCreationTimestamp="2026-04-16 14:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:14.490688407 +0000 UTC m=+3.598317532" 
watchObservedRunningTime="2026-04-16 14:52:15.546597455 +0000 UTC m=+4.654226582" Apr 16 14:52:16.540621 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:16.539930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal" event={"ID":"cf0dcc2c635c65b3b15206dc72520006","Type":"ContainerStarted","Data":"4a6fa7c814be4682c0a9c469b6adf7fef72e63db595bbd76aa15c10e1f63b096"} Apr 16 14:52:16.552519 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:16.552439 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-105.ec2.internal" podStartSLOduration=4.55241845 podStartE2EDuration="4.55241845s" podCreationTimestamp="2026-04-16 14:52:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:16.551980971 +0000 UTC m=+5.659610095" watchObservedRunningTime="2026-04-16 14:52:16.55241845 +0000 UTC m=+5.660047576" Apr 16 14:52:17.112410 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:17.112367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:17.112586 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:17.112442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:17.112586 ip-10-0-129-105 kubenswrapper[2579]: E0416 
14:52:17.112568 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:17.112705 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.112631 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:21.112611737 +0000 UTC m=+10.220240839 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:17.113083 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.113047 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:17.113083 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.113085 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:17.113254 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.113099 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cmdjx for pod openshift-network-diagnostics/network-check-target-6kn6f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:17.113254 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.113147 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx 
podName:274057c1-8751-4b12-8464-7a42a2c6372c nodeName:}" failed. No retries permitted until 2026-04-16 14:52:21.113129865 +0000 UTC m=+10.220758970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmdjx" (UniqueName: "kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx") pod "network-check-target-6kn6f" (UID: "274057c1-8751-4b12-8464-7a42a2c6372c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:17.468598 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:17.468353 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:17.468598 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.468494 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:17.468931 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:17.468898 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:17.469024 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:17.468991 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:19.468934 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:19.468205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:19.468934 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:19.468356 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:19.468934 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:19.468802 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:19.468934 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:19.468890 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:21.149730 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:21.149688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:21.149780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.149895 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.149924 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.149942 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.149955 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cmdjx for pod openshift-network-diagnostics/network-check-target-6kn6f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.149975 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:29.149954497 +0000 UTC m=+18.257583602 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:21.150370 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.150007 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx podName:274057c1-8751-4b12-8464-7a42a2c6372c nodeName:}" failed. No retries permitted until 2026-04-16 14:52:29.149991118 +0000 UTC m=+18.257620234 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmdjx" (UniqueName: "kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx") pod "network-check-target-6kn6f" (UID: "274057c1-8751-4b12-8464-7a42a2c6372c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:21.469345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:21.469266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:21.469345 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:21.469316 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:21.469543 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.469404 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:21.469599 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:21.469532 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:23.468441 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:23.468370 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:23.468441 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:23.468413 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:23.468877 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:23.468527 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:23.468877 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:23.468669 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:25.467786 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:25.467722 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:25.468267 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:25.467859 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:25.468329 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:25.467723 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:25.468413 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:25.468384 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:27.467799 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:27.467759 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:27.468303 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:27.467902 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:27.468303 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:27.467983 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:27.468303 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:27.468125 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:29.203852 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:29.203811 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:29.203878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.203977 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.203999 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.204004 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.204017 2579 projected.go:194] Error preparing data for projected volume kube-api-access-cmdjx for pod openshift-network-diagnostics/network-check-target-6kn6f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.204066 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.20404573 +0000 UTC m=+34.311674836 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:29.204307 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.204100 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx podName:274057c1-8751-4b12-8464-7a42a2c6372c nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.204089641 +0000 UTC m=+34.311718744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmdjx" (UniqueName: "kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx") pod "network-check-target-6kn6f" (UID: "274057c1-8751-4b12-8464-7a42a2c6372c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:29.468057 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:29.467983 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:29.468314 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.468142 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:29.468314 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:29.468181 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:29.468314 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:29.468264 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:29.930641 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:29.929335 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s8pz4"] Apr 16 14:52:30.016024 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.015989 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:30.016198 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:30.016105 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910" Apr 16 14:52:30.111290 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.111248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eb304f66-fefa-4772-b282-3ce9a4298910-dbus\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:30.111480 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.111403 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eb304f66-fefa-4772-b282-3ce9a4298910-kubelet-config\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:30.111480 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.111441 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:30.212572 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.212488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eb304f66-fefa-4772-b282-3ce9a4298910-kubelet-config\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:30.212572 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.212529 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:30.212572 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.212569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eb304f66-fefa-4772-b282-3ce9a4298910-dbus\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:30.213138 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.212616 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/eb304f66-fefa-4772-b282-3ce9a4298910-kubelet-config\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:30.213138 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.212712 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/eb304f66-fefa-4772-b282-3ce9a4298910-dbus\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:30.213138 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:30.212727 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:30.213138 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:30.212793 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret podName:eb304f66-fefa-4772-b282-3ce9a4298910 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:30.712777161 +0000 UTC m=+19.820406264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret") pod "global-pull-secret-syncer-s8pz4" (UID: "eb304f66-fefa-4772-b282-3ce9a4298910") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:30.716878 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:30.716848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:30.717012 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:30.717002 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:30.717060 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:30.717055 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret podName:eb304f66-fefa-4772-b282-3ce9a4298910 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:31.71704125 +0000 UTC m=+20.824670356 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret") pod "global-pull-secret-syncer-s8pz4" (UID: "eb304f66-fefa-4772-b282-3ce9a4298910") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:31.468189 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.467964 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:31.468995 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.467966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:31.468995 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:31.468272 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910"
Apr 16 14:52:31.468995 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:31.468357 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c"
Apr 16 14:52:31.468995 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.468015 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:31.468995 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:31.468455 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872"
Apr 16 14:52:31.564564 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.564528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" event={"ID":"493eb5f3-c1cb-4508-8b86-fb60aa459acc","Type":"ContainerStarted","Data":"08304e81a05c4d868f26b7e97c7c825d2a912c355e3dd90b477dea036e953567"}
Apr 16 14:52:31.565948 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.565923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"002d065fdebd5796b331d1baf3a4473485acee5ca0dafb38c17da151a9b0c807"}
Apr 16 14:52:31.566065 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.565957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"b615a8df4d4795956e6cac8426d07d611ad1300ee441ccf89819ba18909dfddf"}
Apr 16 14:52:31.567155 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.567129 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4d625" event={"ID":"c778a259-410c-444b-a486-c230dd795def","Type":"ContainerStarted","Data":"3393843d7085cb7747e7ac63a97641f0541aaa56bf25a0feac4f5a92f3c4e249"}
Apr 16 14:52:31.568477 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.568445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerStarted","Data":"111e39d0a8ebd9d4d2c4ac51650da26a2c68d1ffa96409c52a22069175f61b6e"}
Apr 16 14:52:31.569953 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.569916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nxthn" event={"ID":"2b4700c8-468e-4f4d-9f39-760f7db3a824","Type":"ContainerStarted","Data":"db0be96a92d1ea51c00abf48d7cb89f8165b03e22df5bbd4acdf82e99caeaabb"}
Apr 16 14:52:31.571281 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.571259 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" event={"ID":"4c666548-852c-40b9-aa2a-ee177bbfb811","Type":"ContainerStarted","Data":"e172258bba856910bcb47a98a8130d3c4d51bdbf3fb68ea4e98784e333f243f1"}
Apr 16 14:52:31.572616 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.572595 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5qg4q" event={"ID":"f6ae390c-ede3-458f-8330-0d8d3aad76c2","Type":"ContainerStarted","Data":"130a241a5e4f937c3a9c6fcda2e83d68b09be2ab5d63591d3968db1542e087ba"}
Apr 16 14:52:31.574090 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.574053 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47k57" event={"ID":"ca6006fe-c049-4f20-b847-d14270e6af58","Type":"ContainerStarted","Data":"5b82a81f46346e2af0268401992311176592c86428dfb40aa3a4379ca2358335"}
Apr 16 14:52:31.577976 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.577938 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4d625" podStartSLOduration=3.600204329 podStartE2EDuration="20.577928146s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.102321214 +0000 UTC m=+3.209950329" lastFinishedPulling="2026-04-16 14:52:31.080045031 +0000 UTC m=+20.187674146" observedRunningTime="2026-04-16 14:52:31.577765528 +0000 UTC m=+20.685394651" watchObservedRunningTime="2026-04-16 14:52:31.577928146 +0000 UTC m=+20.685557269"
Apr 16 14:52:31.588907 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.588830 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-47k57" podStartSLOduration=3.547810424 podStartE2EDuration="20.588815489s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.094614996 +0000 UTC m=+3.202244108" lastFinishedPulling="2026-04-16 14:52:31.135620068 +0000 UTC m=+20.243249173" observedRunningTime="2026-04-16 14:52:31.588705801 +0000 UTC m=+20.696334924" watchObservedRunningTime="2026-04-16 14:52:31.588815489 +0000 UTC m=+20.696444612"
Apr 16 14:52:31.613471 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.613421 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nxthn" podStartSLOduration=3.5854384660000003 podStartE2EDuration="20.613404388s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.09402126 +0000 UTC m=+3.201650363" lastFinishedPulling="2026-04-16 14:52:31.121987172 +0000 UTC m=+20.229616285" observedRunningTime="2026-04-16 14:52:31.612992883 +0000 UTC m=+20.720622007" watchObservedRunningTime="2026-04-16 14:52:31.613404388 +0000 UTC m=+20.721033613"
Apr 16 14:52:31.628394 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.628203 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jpzc4" podStartSLOduration=3.629451515 podStartE2EDuration="20.628188373s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.101989784 +0000 UTC m=+3.209618885" lastFinishedPulling="2026-04-16 14:52:31.100726627 +0000 UTC m=+20.208355743" observedRunningTime="2026-04-16 14:52:31.627650559 +0000 UTC m=+20.735279685" watchObservedRunningTime="2026-04-16 14:52:31.628188373 +0000 UTC m=+20.735817496"
Apr 16 14:52:31.642534 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.642476 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5qg4q" podStartSLOduration=3.6395875220000002 podStartE2EDuration="20.642458523s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.097887212 +0000 UTC m=+3.205516327" lastFinishedPulling="2026-04-16 14:52:31.100758224 +0000 UTC m=+20.208387328" observedRunningTime="2026-04-16 14:52:31.642096869 +0000 UTC m=+20.749725996" watchObservedRunningTime="2026-04-16 14:52:31.642458523 +0000 UTC m=+20.750087646"
Apr 16 14:52:31.726398 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.726361 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:31.726543 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:31.726474 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:31.726543 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:31.726528 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret podName:eb304f66-fefa-4772-b282-3ce9a4298910 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:33.726514515 +0000 UTC m=+22.834143617 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret") pod "global-pull-secret-syncer-s8pz4" (UID: "eb304f66-fefa-4772-b282-3ce9a4298910") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:31.960269 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.960224 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:31.960902 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:31.960878 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:32.578462 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578434 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log"
Apr 16 14:52:32.578825 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578709 2579 generic.go:358] "Generic (PLEG): container finished" podID="539ba0b2-e94b-4e6d-9955-d2325acb7a00" containerID="002d065fdebd5796b331d1baf3a4473485acee5ca0dafb38c17da151a9b0c807" exitCode=1
Apr 16 14:52:32.578825 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerDied","Data":"002d065fdebd5796b331d1baf3a4473485acee5ca0dafb38c17da151a9b0c807"}
Apr 16 14:52:32.578825 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"ee5814340f8817c176fe981950e5a55df61131070e37ec8d79b10981feff6fb6"}
Apr 16 14:52:32.578943 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"0a849bcebf0c26a05ff9b4797d2272b97fdcc1fe48cd4661b81d3b6f54238a8c"}
Apr 16 14:52:32.578943 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578848 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"4c94539a91503d2cae0640bd4e419be1cc56ce53fbfa7790ac81f18d008d381a"}
Apr 16 14:52:32.578943 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.578861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"dffff1e98b0865ba2dbadc4a5309c59f553ea93fc08698d0d3414f6b4da0df56"}
Apr 16 14:52:32.580046 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.580025 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc0a6a72-089b-44bd-97ca-a4963264f458" containerID="111e39d0a8ebd9d4d2c4ac51650da26a2c68d1ffa96409c52a22069175f61b6e" exitCode=0
Apr 16 14:52:32.580153 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.580105 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerDied","Data":"111e39d0a8ebd9d4d2c4ac51650da26a2c68d1ffa96409c52a22069175f61b6e"}
Apr 16 14:52:32.581360 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.581335 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cl4w9" event={"ID":"9e9ac884-30e2-4486-9e89-d541e73ee8c4","Type":"ContainerStarted","Data":"69c87522d9a50747fcee0800484d1a25075f838645fd7994b5a47fc7db447511"}
Apr 16 14:52:32.581674 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.581650 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:32.582086 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.582054 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nxthn"
Apr 16 14:52:32.612780 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.612737 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cl4w9" podStartSLOduration=4.942119358 podStartE2EDuration="21.612723365s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.072959985 +0000 UTC m=+3.180589094" lastFinishedPulling="2026-04-16 14:52:30.743563983 +0000 UTC m=+19.851193101" observedRunningTime="2026-04-16 14:52:32.612408972 +0000 UTC m=+21.720038096" watchObservedRunningTime="2026-04-16 14:52:32.612723365 +0000 UTC m=+21.720352488"
Apr 16 14:52:32.779014 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:32.778811 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:52:33.442262 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.442148 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:32.778970912Z","UUID":"80f7cab4-7602-43d1-bf24-40683080446a","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:52:33.444152 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.444130 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:52:33.444305 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.444162 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 14:52:33.471098 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.471051 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:33.471261 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.471051 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:33.471261 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:33.471190 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c"
Apr 16 14:52:33.471362 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:33.471294 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872"
Apr 16 14:52:33.471362 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.471051 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:33.471440 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:33.471401 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910"
Apr 16 14:52:33.585815 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.585519 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" event={"ID":"493eb5f3-c1cb-4508-8b86-fb60aa459acc","Type":"ContainerStarted","Data":"e8cbc6767da2c3edf6abce88729448dc4b3e6f99fa27054feec702268047348a"}
Apr 16 14:52:33.740868 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:33.740788 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:33.741010 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:33.740949 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:33.741067 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:33.741051 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret podName:eb304f66-fefa-4772-b282-3ce9a4298910 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:37.741009647 +0000 UTC m=+26.848638756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret") pod "global-pull-secret-syncer-s8pz4" (UID: "eb304f66-fefa-4772-b282-3ce9a4298910") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:34.589555 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:34.589366 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" event={"ID":"493eb5f3-c1cb-4508-8b86-fb60aa459acc","Type":"ContainerStarted","Data":"67b747f50063c84b5fe92be72ff59a0293b061dd010012c479c3bd22abc66e45"}
Apr 16 14:52:34.592709 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:34.592689 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log"
Apr 16 14:52:34.593473 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:34.593404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"3a96037add777c9316d721e253bf5fbf32e2d41c241ca460eaa0be02d2b0a146"}
Apr 16 14:52:34.604544 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:34.604480 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pwmd6" podStartSLOduration=3.569086241 podStartE2EDuration="23.604464965s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.102244527 +0000 UTC m=+3.209873642" lastFinishedPulling="2026-04-16 14:52:34.137623249 +0000 UTC m=+23.245252366" observedRunningTime="2026-04-16 14:52:34.604281245 +0000 UTC m=+23.711910371" watchObservedRunningTime="2026-04-16 14:52:34.604464965 +0000 UTC m=+23.712094091"
Apr 16 14:52:35.467761 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:35.467723 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:35.467761 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:35.467766 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:35.468006 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:35.467856 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c"
Apr 16 14:52:35.468006 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:35.467923 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:35.468122 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:35.468039 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872"
Apr 16 14:52:35.468158 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:35.468143 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910"
Apr 16 14:52:37.470620 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.470442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:37.471323 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.470442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:37.471323 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:37.470693 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910"
Apr 16 14:52:37.471323 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.470442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:37.471323 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:37.470770 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c"
Apr 16 14:52:37.471323 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:37.470827 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872"
Apr 16 14:52:37.601898 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.601873 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log"
Apr 16 14:52:37.602250 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.602226 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"5ab9cf74e95ddda62890bf7bc286f087c774758aae21ebb82fc5373e3ac12583"}
Apr 16 14:52:37.602500 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.602467 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:37.602500 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.602497 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:37.602686 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.602509 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:37.602773 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.602741 2579 scope.go:117] "RemoveContainer" containerID="002d065fdebd5796b331d1baf3a4473485acee5ca0dafb38c17da151a9b0c807"
Apr 16 14:52:37.603993 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.603969 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc0a6a72-089b-44bd-97ca-a4963264f458" containerID="495d57be75b928d9ef314ed7022cd0145737819a61cf5b3aebf184c5219c220b" exitCode=0
Apr 16 14:52:37.604102 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.604017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerDied","Data":"495d57be75b928d9ef314ed7022cd0145737819a61cf5b3aebf184c5219c220b"}
Apr 16 14:52:37.618359 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.618295 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:37.618414 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.618392 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:52:37.771751 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:37.771643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:37.771940 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:37.771802 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:37.771940 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:37.771873 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret podName:eb304f66-fefa-4772-b282-3ce9a4298910 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.771858033 +0000 UTC m=+34.879487136 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret") pod "global-pull-secret-syncer-s8pz4" (UID: "eb304f66-fefa-4772-b282-3ce9a4298910") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:38.549445 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.549213 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s8pz4"]
Apr 16 14:52:38.549774 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.549505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:38.549774 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:38.549590 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910"
Apr 16 14:52:38.552477 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.552447 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8g7qk"]
Apr 16 14:52:38.552597 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.552573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:38.552681 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:38.552663 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872"
Apr 16 14:52:38.553353 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.553329 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6kn6f"]
Apr 16 14:52:38.553442 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.553432 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:38.553546 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:38.553521 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c"
Apr 16 14:52:38.609015 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.608986 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log"
Apr 16 14:52:38.609365 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.609333 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" event={"ID":"539ba0b2-e94b-4e6d-9955-d2325acb7a00","Type":"ContainerStarted","Data":"580c8a79fe0b2aea6a8637a810a53d5f86796c004da3d74d573762fefa42fd78"}
Apr 16 14:52:38.611151 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.611130 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc0a6a72-089b-44bd-97ca-a4963264f458" containerID="7130067849cab5c41d7cf9aa640f296ab8313cfa1aa773a325c5bf57604fbfe0" exitCode=0
Apr 16 14:52:38.611249 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.611168 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerDied","Data":"7130067849cab5c41d7cf9aa640f296ab8313cfa1aa773a325c5bf57604fbfe0"}
Apr 16 14:52:38.632631 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:38.632590 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m882k" podStartSLOduration=10.559263302 podStartE2EDuration="27.632576837s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.102670678 +0000 UTC m=+3.210299787" lastFinishedPulling="2026-04-16 14:52:31.175984208 +0000 UTC m=+20.283613322" observedRunningTime="2026-04-16 14:52:38.631776042 +0000 UTC m=+27.739405166" watchObservedRunningTime="2026-04-16 14:52:38.632576837 +0000 UTC m=+27.740205960"
Apr 16 14:52:39.616545 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:39.616457 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc0a6a72-089b-44bd-97ca-a4963264f458" containerID="d2342789f8ce5002746111855461a9e0e6413d49332f420b141589bdf69ed52d" exitCode=0
Apr 16 14:52:39.616545 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:39.616534 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerDied","Data":"d2342789f8ce5002746111855461a9e0e6413d49332f420b141589bdf69ed52d"}
Apr 16 14:52:40.468002 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:40.467954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:40.468196 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:40.468087 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:40.468196 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:40.468124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:52:40.468321 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:40.468088 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910"
Apr 16 14:52:40.468321 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:40.468198 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c"
Apr 16 14:52:40.468424 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:40.468309 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872"
Apr 16 14:52:42.468290 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:42.468256 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:42.469022 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:42.468256 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4"
Apr 16 14:52:42.469022 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:42.468375 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-6kn6f" podUID="274057c1-8751-4b12-8464-7a42a2c6372c" Apr 16 14:52:42.469022 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:42.468268 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:42.469022 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:42.468448 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-s8pz4" podUID="eb304f66-fefa-4772-b282-3ce9a4298910" Apr 16 14:52:42.469022 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:42.468529 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:52:44.194847 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.194819 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-105.ec2.internal" event="NodeReady" Apr 16 14:52:44.195321 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.195006 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:52:44.233570 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.233534 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9zlt8"] Apr 16 14:52:44.235995 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.235967 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.236498 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.236472 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hg2hp"] Apr 16 14:52:44.238348 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.238108 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:44.238348 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.238193 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:52:44.238348 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.238270 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9hwfz\"" Apr 16 14:52:44.238647 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.238632 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:52:44.239957 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.239932 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:52:44.240045 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.239941 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:52:44.240553 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.240533 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cvt8w\"" Apr 16 14:52:44.240656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.240586 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:52:44.248385 ip-10-0-129-105 kubenswrapper[2579]: 
I0416 14:52:44.248359 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9zlt8"] Apr 16 14:52:44.248838 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.248813 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hg2hp"] Apr 16 14:52:44.321547 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.321512 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmjz\" (UniqueName: \"kubernetes.io/projected/be6cb6cd-b928-4807-90d1-c1f8d6657af1-kube-api-access-fnmjz\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.321727 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.321563 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be6cb6cd-b928-4807-90d1-c1f8d6657af1-config-volume\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.321727 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.321593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:44.321727 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.321648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t52n4\" (UniqueName: \"kubernetes.io/projected/7c5aa40b-af79-42ef-99df-394eb1b2d683-kube-api-access-t52n4\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 
14:52:44.321727 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.321703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.321907 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.321734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be6cb6cd-b928-4807-90d1-c1f8d6657af1-tmp-dir\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.423007 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.422975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmjz\" (UniqueName: \"kubernetes.io/projected/be6cb6cd-b928-4807-90d1-c1f8d6657af1-kube-api-access-fnmjz\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.423007 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be6cb6cd-b928-4807-90d1-c1f8d6657af1-config-volume\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.423261 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:44.423261 
ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.423151 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:44.423261 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t52n4\" (UniqueName: \"kubernetes.io/projected/7c5aa40b-af79-42ef-99df-394eb1b2d683-kube-api-access-t52n4\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:44.423261 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423207 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.423261 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be6cb6cd-b928-4807-90d1-c1f8d6657af1-tmp-dir\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.423557 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.423304 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:44.423557 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.423334 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.923315253 +0000 UTC m=+34.030944361 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found Apr 16 14:52:44.423557 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.423354 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:44.923344324 +0000 UTC m=+34.030973426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found Apr 16 14:52:44.423557 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be6cb6cd-b928-4807-90d1-c1f8d6657af1-tmp-dir\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.423737 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.423626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be6cb6cd-b928-4807-90d1-c1f8d6657af1-config-volume\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.433000 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.432971 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmjz\" (UniqueName: \"kubernetes.io/projected/be6cb6cd-b928-4807-90d1-c1f8d6657af1-kube-api-access-fnmjz\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " 
pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.433149 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.433053 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t52n4\" (UniqueName: \"kubernetes.io/projected/7c5aa40b-af79-42ef-99df-394eb1b2d683-kube-api-access-t52n4\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:44.468473 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.468393 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:44.468473 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.468452 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:44.468662 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.468650 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:44.470575 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.470554 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:52:44.470775 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.470753 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:52:44.470775 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.470771 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fb76v\"" Apr 16 14:52:44.470925 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.470772 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sk8kg\"" Apr 16 14:52:44.470925 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.470808 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:52:44.470925 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.470812 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:52:44.928250 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:44.927934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:44.928426 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.928129 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:44.928426 ip-10-0-129-105 
kubenswrapper[2579]: I0416 14:52:44.928309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:44.928426 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.928343 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.92831728 +0000 UTC m=+35.035946406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found Apr 16 14:52:44.928426 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.928405 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:44.928637 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:44.928499 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.928485682 +0000 UTC m=+35.036114784 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found Apr 16 14:52:45.230769 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.230733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:52:45.231233 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:45.230907 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:52:45.231233 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.230919 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:45.231233 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:45.230982 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:17.230961795 +0000 UTC m=+66.338590905 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : secret "metrics-daemon-secret" not found Apr 16 14:52:45.233714 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.233692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmdjx\" (UniqueName: \"kubernetes.io/projected/274057c1-8751-4b12-8464-7a42a2c6372c-kube-api-access-cmdjx\") pod \"network-check-target-6kn6f\" (UID: \"274057c1-8751-4b12-8464-7a42a2c6372c\") " pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:45.381298 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.381258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6kn6f" Apr 16 14:52:45.571100 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.571055 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6kn6f"] Apr 16 14:52:45.575641 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:45.575613 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274057c1_8751_4b12_8464_7a42a2c6372c.slice/crio-eafce533e9cbb908b8e712de6f5e7d0b6c13d255ea7edebfd8f8e91aebd58e52 WatchSource:0}: Error finding container eafce533e9cbb908b8e712de6f5e7d0b6c13d255ea7edebfd8f8e91aebd58e52: Status 404 returned error can't find the container with id eafce533e9cbb908b8e712de6f5e7d0b6c13d255ea7edebfd8f8e91aebd58e52 Apr 16 14:52:45.630013 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.629973 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6kn6f" 
event={"ID":"274057c1-8751-4b12-8464-7a42a2c6372c","Type":"ContainerStarted","Data":"eafce533e9cbb908b8e712de6f5e7d0b6c13d255ea7edebfd8f8e91aebd58e52"} Apr 16 14:52:45.632504 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.632478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerStarted","Data":"303d6b444018ede219bd9e436cea288ab79e21e53db421fd27454aa861d6a305"} Apr 16 14:52:45.836637 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.836552 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:45.840032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.840011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/eb304f66-fefa-4772-b282-3ce9a4298910-original-pull-secret\") pod \"global-pull-secret-syncer-s8pz4\" (UID: \"eb304f66-fefa-4772-b282-3ce9a4298910\") " pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:45.937495 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.937455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:52:45.937669 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.937560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod 
\"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:52:45.937669 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:45.937629 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:45.937811 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:45.937673 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:45.937811 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:45.937697 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.937681288 +0000 UTC m=+37.045310389 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found Apr 16 14:52:45.937811 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:45.937721 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:47.937704404 +0000 UTC m=+37.045333508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found Apr 16 14:52:45.988986 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:45.988947 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-s8pz4" Apr 16 14:52:46.117222 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:46.117132 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s8pz4"] Apr 16 14:52:46.121439 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:52:46.121410 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb304f66_fefa_4772_b282_3ce9a4298910.slice/crio-54373b5127e371cc10e514dbd0c56c2e53077ad7792bcbc5382a31442de5d676 WatchSource:0}: Error finding container 54373b5127e371cc10e514dbd0c56c2e53077ad7792bcbc5382a31442de5d676: Status 404 returned error can't find the container with id 54373b5127e371cc10e514dbd0c56c2e53077ad7792bcbc5382a31442de5d676 Apr 16 14:52:46.638547 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:46.638364 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc0a6a72-089b-44bd-97ca-a4963264f458" containerID="303d6b444018ede219bd9e436cea288ab79e21e53db421fd27454aa861d6a305" exitCode=0 Apr 16 14:52:46.638547 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:46.638463 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerDied","Data":"303d6b444018ede219bd9e436cea288ab79e21e53db421fd27454aa861d6a305"} Apr 16 14:52:46.640310 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:46.640268 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s8pz4" event={"ID":"eb304f66-fefa-4772-b282-3ce9a4298910","Type":"ContainerStarted","Data":"54373b5127e371cc10e514dbd0c56c2e53077ad7792bcbc5382a31442de5d676"} Apr 16 14:52:47.646190 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:47.646152 2579 generic.go:358] "Generic (PLEG): container finished" podID="cc0a6a72-089b-44bd-97ca-a4963264f458" 
containerID="35f7aff33916d871e62ad95bd00fa6453f42b20f7f54442add227fe4424d348c" exitCode=0
Apr 16 14:52:47.647297 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:47.646203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerDied","Data":"35f7aff33916d871e62ad95bd00fa6453f42b20f7f54442add227fe4424d348c"}
Apr 16 14:52:47.953719 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:47.953683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8"
Apr 16 14:52:47.953909 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:47.953774 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp"
Apr 16 14:52:47.953909 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:47.953868 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:47.953909 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:47.953870 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:47.954064 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:47.953931 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:51.95390881 +0000 UTC m=+41.061537912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found
Apr 16 14:52:47.954064 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:47.953949 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:51.953940776 +0000 UTC m=+41.061569882 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found
Apr 16 14:52:50.657931 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.657821 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6kn6f" event={"ID":"274057c1-8751-4b12-8464-7a42a2c6372c","Type":"ContainerStarted","Data":"3d37f054dad14c5db5bab275d2a33f862dfa3b1d4f5946cbfb2be651710ca93e"}
Apr 16 14:52:50.658551 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.658223 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:52:50.659731 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.659703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s8pz4" event={"ID":"eb304f66-fefa-4772-b282-3ce9a4298910","Type":"ContainerStarted","Data":"3085bee29d977e724b1ecbdeea398cc87981bdb9b4cbce317087475fda96d694"}
Apr 16 14:52:50.662656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.662633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" event={"ID":"cc0a6a72-089b-44bd-97ca-a4963264f458","Type":"ContainerStarted","Data":"9d6ebc6bf2f1d7e293ac0a4414461d08bc6290e2ce2d691903a631f41fcd9051"}
Apr 16 14:52:50.670492 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.670452 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6kn6f" podStartSLOduration=34.897762589 podStartE2EDuration="39.670438065s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:45.57765053 +0000 UTC m=+34.685279632" lastFinishedPulling="2026-04-16 14:52:50.350326003 +0000 UTC m=+39.457955108" observedRunningTime="2026-04-16 14:52:50.670329719 +0000 UTC m=+39.777958842" watchObservedRunningTime="2026-04-16 14:52:50.670438065 +0000 UTC m=+39.778067190"
Apr 16 14:52:50.682593 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.682546 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s8pz4" podStartSLOduration=17.445915483 podStartE2EDuration="21.682533859s" podCreationTimestamp="2026-04-16 14:52:29 +0000 UTC" firstStartedPulling="2026-04-16 14:52:46.123630072 +0000 UTC m=+35.231259174" lastFinishedPulling="2026-04-16 14:52:50.360248437 +0000 UTC m=+39.467877550" observedRunningTime="2026-04-16 14:52:50.682048228 +0000 UTC m=+39.789677353" watchObservedRunningTime="2026-04-16 14:52:50.682533859 +0000 UTC m=+39.790162982"
Apr 16 14:52:50.701398 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:50.701353 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hkq7w" podStartSLOduration=8.402645372 podStartE2EDuration="39.701341309s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:14.097461152 +0000 UTC m=+3.205090254" lastFinishedPulling="2026-04-16 14:52:45.396157084 +0000 UTC m=+34.503786191" observedRunningTime="2026-04-16 14:52:50.700790249 +0000 UTC m=+39.808419384" watchObservedRunningTime="2026-04-16 14:52:50.701341309 +0000 UTC m=+39.808970432"
Apr 16 14:52:51.982368 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:51.982319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp"
Apr 16 14:52:51.982368 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:52:51.982376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8"
Apr 16 14:52:51.982839 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:51.982472 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:52:51.982839 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:51.982536 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:52:51.982839 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:51.982560 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:59.982544438 +0000 UTC m=+49.090173540 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found
Apr 16 14:52:51.982839 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:52:51.982590 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:59.982574209 +0000 UTC m=+49.090203327 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:00.039925 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:00.039879 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp"
Apr 16 14:53:00.039925 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:00.039932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8"
Apr 16 14:53:00.040471 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:00.040048 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:00.040471 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:00.040045 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:00.040471 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:00.040144 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:16.040128369 +0000 UTC m=+65.147757470 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:00.040471 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:00.040179 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:16.04016526 +0000 UTC m=+65.147794362 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found
Apr 16 14:53:09.626880 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:09.626849 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m882k"
Apr 16 14:53:16.052892 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:16.052846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8"
Apr 16 14:53:16.053306 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:16.052911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp"
Apr 16 14:53:16.053306 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:16.053005 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:16.053306 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:16.053036 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:16.053306 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:16.053116 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:48.053068255 +0000 UTC m=+97.160697370 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found
Apr 16 14:53:16.053306 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:16.053185 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:48.053168523 +0000 UTC m=+97.160797635 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found
Apr 16 14:53:17.261912 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:17.261849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:53:17.262330 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:17.261995 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:53:17.262330 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:17.262065 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:21.262048359 +0000 UTC m=+130.369677461 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : secret "metrics-daemon-secret" not found
Apr 16 14:53:21.667312 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:21.667283 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6kn6f"
Apr 16 14:53:48.065704 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:48.065641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp"
Apr 16 14:53:48.066053 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:53:48.065716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8"
Apr 16 14:53:48.066053 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:48.065802 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:53:48.066053 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:48.065874 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:52.065858505 +0000 UTC m=+161.173487611 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found
Apr 16 14:53:48.066053 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:48.065805 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:53:48.066053 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:53:48.065928 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:52.065915879 +0000 UTC m=+161.173544980 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found
Apr 16 14:54:21.290410 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:21.290357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:54:21.290904 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:21.290477 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:54:21.290904 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:21.290540 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs podName:a3db0253-f985-4d95-b46c-abb2acc3e872 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:23.290525407 +0000 UTC m=+252.398154509 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs") pod "network-metrics-daemon-8g7qk" (UID: "a3db0253-f985-4d95-b46c-abb2acc3e872") : secret "metrics-daemon-secret" not found
Apr 16 14:54:38.586016 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.585983 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-2lhl8"]
Apr 16 14:54:38.588913 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.588888 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.590901 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.590875 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"]
Apr 16 14:54:38.591962 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.591933 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 14:54:38.591962 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.591951 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 14:54:38.592183 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.591952 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:54:38.592183 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.591952 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 14:54:38.592370 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.592355 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vmfpw\""
Apr 16 14:54:38.593540 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.593523 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7468b496d-5qb6b"]
Apr 16 14:54:38.593674 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.593658 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:38.595417 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.595394 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 14:54:38.595592 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.595450 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-ldmqp\""
Apr 16 14:54:38.595656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.595630 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 14:54:38.595723 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.595706 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:54:38.596286 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.596256 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.597959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.597943 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 14:54:38.598340 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598315 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 14:54:38.598471 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598380 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 14:54:38.598471 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598422 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 14:54:38.598646 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598480 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 14:54:38.598646 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598585 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 14:54:38.598750 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598733 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 14:54:38.598804 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.598751 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-d94pn\""
Apr 16 14:54:38.603133 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.603093 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-2lhl8"]
Apr 16 14:54:38.607728 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.607695 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"]
Apr 16 14:54:38.610467 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.610445 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7468b496d-5qb6b"]
Apr 16 14:54:38.708760 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708725 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dbffa58-0a86-4116-9fd8-0dca9f45e365-trusted-ca\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.708760 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.708968 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708845 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-default-certificate\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.708968 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-stats-auth\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.708968 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbffa58-0a86-4116-9fd8-0dca9f45e365-config\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.708968 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708912 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dbffa58-0a86-4116-9fd8-0dca9f45e365-serving-cert\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.708968 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m4rp\" (UniqueName: \"kubernetes.io/projected/6dbffa58-0a86-4116-9fd8-0dca9f45e365-kube-api-access-8m4rp\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.709158 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.708978 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwqdz\" (UniqueName: \"kubernetes.io/projected/0848e866-88d6-48a6-abea-931262f45c54-kube-api-access-jwqdz\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:38.709158 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.709018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.709158 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.709044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:38.709158 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.709063 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tltj4\" (UniqueName: \"kubernetes.io/projected/99e316cc-57e8-4d70-bb6b-e957b5bccf87-kube-api-access-tltj4\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.809369 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.809369 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809379 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:38.809556 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tltj4\" (UniqueName: \"kubernetes.io/projected/99e316cc-57e8-4d70-bb6b-e957b5bccf87-kube-api-access-tltj4\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.809556 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dbffa58-0a86-4116-9fd8-0dca9f45e365-trusted-ca\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.809556 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.809556 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-default-certificate\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.809556 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809501 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-stats-auth\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.809556 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:38.809476 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbffa58-0a86-4116-9fd8-0dca9f45e365-config\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:38.809495 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:38.809624 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:39.309599427 +0000 UTC m=+148.417228545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : secret "router-metrics-certs-default" not found
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:38.809691 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:39.309673433 +0000 UTC m=+148.417302536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:38.809705 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls podName:0848e866-88d6-48a6-abea-931262f45c54 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:39.309696878 +0000 UTC m=+148.417325981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls") pod "cluster-samples-operator-667775844f-ps85k" (UID: "0848e866-88d6-48a6-abea-931262f45c54") : secret "samples-operator-tls" not found
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dbffa58-0a86-4116-9fd8-0dca9f45e365-serving-cert\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m4rp\" (UniqueName: \"kubernetes.io/projected/6dbffa58-0a86-4116-9fd8-0dca9f45e365-kube-api-access-8m4rp\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.809856 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.809789 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwqdz\" (UniqueName: \"kubernetes.io/projected/0848e866-88d6-48a6-abea-931262f45c54-kube-api-access-jwqdz\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:38.810414 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.810389 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dbffa58-0a86-4116-9fd8-0dca9f45e365-config\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.810577 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.810526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6dbffa58-0a86-4116-9fd8-0dca9f45e365-trusted-ca\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.812604 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.812577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dbffa58-0a86-4116-9fd8-0dca9f45e365-serving-cert\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.812745 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.812725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-default-certificate\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.812884 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.812865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-stats-auth\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.817219 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.817192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m4rp\" (UniqueName: \"kubernetes.io/projected/6dbffa58-0a86-4116-9fd8-0dca9f45e365-kube-api-access-8m4rp\") pod \"console-operator-d87b8d5fc-2lhl8\" (UID: \"6dbffa58-0a86-4116-9fd8-0dca9f45e365\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:54:38.817615 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.817596 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwqdz\" (UniqueName: \"kubernetes.io/projected/0848e866-88d6-48a6-abea-931262f45c54-kube-api-access-jwqdz\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:38.817752 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.817732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tltj4\" (UniqueName: \"kubernetes.io/projected/99e316cc-57e8-4d70-bb6b-e957b5bccf87-kube-api-access-tltj4\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:38.901377 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:38.901283 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" Apr 16 14:54:39.018386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:39.018353 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-2lhl8"] Apr 16 14:54:39.021549 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:54:39.021518 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dbffa58_0a86_4116_9fd8_0dca9f45e365.slice/crio-6f212276255c60413072b9094c4bbf77b4d25490b200ba45c929d8a5ef9bcd2b WatchSource:0}: Error finding container 6f212276255c60413072b9094c4bbf77b4d25490b200ba45c929d8a5ef9bcd2b: Status 404 returned error can't find the container with id 6f212276255c60413072b9094c4bbf77b4d25490b200ba45c929d8a5ef9bcd2b Apr 16 14:54:39.314123 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:39.314065 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" Apr 16 14:54:39.314333 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:39.314149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:39.314333 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:39.314197 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:39.314333 
ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:39.314277 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls podName:0848e866-88d6-48a6-abea-931262f45c54 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.314253597 +0000 UTC m=+149.421882713 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls") pod "cluster-samples-operator-667775844f-ps85k" (UID: "0848e866-88d6-48a6-abea-931262f45c54") : secret "samples-operator-tls" not found Apr 16 14:54:39.314333 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:39.314276 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:54:39.314333 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:39.314200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:39.314333 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:39.314304 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.314287653 +0000 UTC m=+149.421916759 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : configmap references non-existent config key: service-ca.crt Apr 16 14:54:39.314333 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:39.314336 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:40.314318027 +0000 UTC m=+149.421947136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : secret "router-metrics-certs-default" not found Apr 16 14:54:39.872133 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:39.872091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" event={"ID":"6dbffa58-0a86-4116-9fd8-0dca9f45e365","Type":"ContainerStarted","Data":"6f212276255c60413072b9094c4bbf77b4d25490b200ba45c929d8a5ef9bcd2b"} Apr 16 14:54:40.324044 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.323997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:40.324263 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.324128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:40.324263 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.324165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" Apr 16 14:54:40.324263 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:40.324207 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:42.324183389 +0000 UTC m=+151.431812516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : configmap references non-existent config key: service-ca.crt Apr 16 14:54:40.324263 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:40.324260 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:40.324486 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:40.324320 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls podName:0848e866-88d6-48a6-abea-931262f45c54 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:54:42.324303551 +0000 UTC m=+151.431932653 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls") pod "cluster-samples-operator-667775844f-ps85k" (UID: "0848e866-88d6-48a6-abea-931262f45c54") : secret "samples-operator-tls" not found Apr 16 14:54:40.324486 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:40.324326 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:54:40.324486 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:40.324383 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:42.324365961 +0000 UTC m=+151.431995084 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : secret "router-metrics-certs-default" not found Apr 16 14:54:40.875507 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.875478 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/0.log" Apr 16 14:54:40.875851 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.875520 2579 generic.go:358] "Generic (PLEG): container finished" podID="6dbffa58-0a86-4116-9fd8-0dca9f45e365" containerID="646138369f1203de14ca96b212648414b45f0f8f36e0877f7131e53a3db2b2b2" exitCode=255 Apr 16 14:54:40.875851 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.875575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" event={"ID":"6dbffa58-0a86-4116-9fd8-0dca9f45e365","Type":"ContainerDied","Data":"646138369f1203de14ca96b212648414b45f0f8f36e0877f7131e53a3db2b2b2"} Apr 16 14:54:40.875851 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:40.875822 2579 scope.go:117] "RemoveContainer" containerID="646138369f1203de14ca96b212648414b45f0f8f36e0877f7131e53a3db2b2b2" Apr 16 14:54:41.878948 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:41.878918 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 14:54:41.879382 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:41.879297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/0.log" Apr 16 14:54:41.879382 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:41.879331 2579 
generic.go:358] "Generic (PLEG): container finished" podID="6dbffa58-0a86-4116-9fd8-0dca9f45e365" containerID="1d9abf0c6f83fbc17a2f47f08b9b69803f066ff2201e6a258936cec9715a93f3" exitCode=255 Apr 16 14:54:41.879382 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:41.879359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" event={"ID":"6dbffa58-0a86-4116-9fd8-0dca9f45e365","Type":"ContainerDied","Data":"1d9abf0c6f83fbc17a2f47f08b9b69803f066ff2201e6a258936cec9715a93f3"} Apr 16 14:54:41.879476 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:41.879387 2579 scope.go:117] "RemoveContainer" containerID="646138369f1203de14ca96b212648414b45f0f8f36e0877f7131e53a3db2b2b2" Apr 16 14:54:41.879667 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:41.879654 2579 scope.go:117] "RemoveContainer" containerID="1d9abf0c6f83fbc17a2f47f08b9b69803f066ff2201e6a258936cec9715a93f3" Apr 16 14:54:41.879887 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:41.879867 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2lhl8_openshift-console-operator(6dbffa58-0a86-4116-9fd8-0dca9f45e365)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" podUID="6dbffa58-0a86-4116-9fd8-0dca9f45e365" Apr 16 14:54:42.342087 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:42.342011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" Apr 16 14:54:42.342311 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:42.342114 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:42.342311 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:42.342201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:42.342311 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:42.342221 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:42.342311 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:42.342282 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:54:42.342311 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:42.342292 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls podName:0848e866-88d6-48a6-abea-931262f45c54 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:46.342270547 +0000 UTC m=+155.449899662 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls") pod "cluster-samples-operator-667775844f-ps85k" (UID: "0848e866-88d6-48a6-abea-931262f45c54") : secret "samples-operator-tls" not found Apr 16 14:54:42.342508 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:42.342324 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:46.342300726 +0000 UTC m=+155.449929848 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : configmap references non-existent config key: service-ca.crt Apr 16 14:54:42.342508 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:42.342356 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:46.342344417 +0000 UTC m=+155.449973524 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : secret "router-metrics-certs-default" not found Apr 16 14:54:42.882700 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:42.882672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 14:54:42.883064 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:42.883004 2579 scope.go:117] "RemoveContainer" containerID="1d9abf0c6f83fbc17a2f47f08b9b69803f066ff2201e6a258936cec9715a93f3" Apr 16 14:54:42.883234 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:42.883214 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2lhl8_openshift-console-operator(6dbffa58-0a86-4116-9fd8-0dca9f45e365)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" podUID="6dbffa58-0a86-4116-9fd8-0dca9f45e365" Apr 16 14:54:43.829344 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:43.829318 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5qg4q_f6ae390c-ede3-458f-8330-0d8d3aad76c2/dns-node-resolver/0.log" Apr 16 14:54:44.629900 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:44.629873 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4d625_c778a259-410c-444b-a486-c230dd795def/node-ca/0.log" Apr 16 14:54:46.377203 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:46.377148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:46.377254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:46.377286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:46.377320 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:54.377302093 +0000 UTC m=+163.484931222 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : configmap references non-existent config key: service-ca.crt Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:46.377396 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:46.377452 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:54.377437964 +0000 UTC m=+163.485067066 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : secret "router-metrics-certs-default" not found Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:46.377401 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:54:46.377661 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:46.377484 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls podName:0848e866-88d6-48a6-abea-931262f45c54 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:54.377477151 +0000 UTC m=+163.485106252 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls") pod "cluster-samples-operator-667775844f-ps85k" (UID: "0848e866-88d6-48a6-abea-931262f45c54") : secret "samples-operator-tls" not found Apr 16 14:54:47.251453 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:47.251403 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9zlt8" podUID="be6cb6cd-b928-4807-90d1-c1f8d6657af1" Apr 16 14:54:47.257575 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:47.257537 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hg2hp" podUID="7c5aa40b-af79-42ef-99df-394eb1b2d683" Apr 16 14:54:47.495434 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:47.495385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8g7qk" podUID="a3db0253-f985-4d95-b46c-abb2acc3e872" Apr 16 14:54:47.891677 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:47.891644 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:54:47.891826 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:47.891801 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9zlt8" Apr 16 14:54:48.901746 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:48.901653 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" Apr 16 14:54:48.901746 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:48.901695 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" Apr 16 14:54:48.902173 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:48.902049 2579 scope.go:117] "RemoveContainer" containerID="1d9abf0c6f83fbc17a2f47f08b9b69803f066ff2201e6a258936cec9715a93f3" Apr 16 14:54:48.902255 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:48.902236 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2lhl8_openshift-console-operator(6dbffa58-0a86-4116-9fd8-0dca9f45e365)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" podUID="6dbffa58-0a86-4116-9fd8-0dca9f45e365" Apr 16 14:54:50.492437 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.492405 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-mgz7w"] Apr 16 14:54:50.496416 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.496401 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w" Apr 16 14:54:50.498057 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.498035 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:54:50.498231 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.498212 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:54:50.498819 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.498795 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lm4zz\"" Apr 16 14:54:50.498924 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.498795 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:54:50.498924 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.498821 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:54:50.503656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.503638 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-mgz7w"] Apr 16 14:54:50.612056 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.612010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/29a60a02-b362-4786-b057-8df4ea2185ed-signing-key\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w" Apr 16 14:54:50.612056 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.612060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fknl\" (UniqueName: 
\"kubernetes.io/projected/29a60a02-b362-4786-b057-8df4ea2185ed-kube-api-access-4fknl\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.612284 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.612182 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/29a60a02-b362-4786-b057-8df4ea2185ed-signing-cabundle\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.713556 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.713520 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/29a60a02-b362-4786-b057-8df4ea2185ed-signing-key\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.713674 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.713580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fknl\" (UniqueName: \"kubernetes.io/projected/29a60a02-b362-4786-b057-8df4ea2185ed-kube-api-access-4fknl\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.713738 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.713722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/29a60a02-b362-4786-b057-8df4ea2185ed-signing-cabundle\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.714445 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.714420 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/29a60a02-b362-4786-b057-8df4ea2185ed-signing-cabundle\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.715938 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.715920 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/29a60a02-b362-4786-b057-8df4ea2185ed-signing-key\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.720934 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.720911 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fknl\" (UniqueName: \"kubernetes.io/projected/29a60a02-b362-4786-b057-8df4ea2185ed-kube-api-access-4fknl\") pod \"service-ca-bfc587fb7-mgz7w\" (UID: \"29a60a02-b362-4786-b057-8df4ea2185ed\") " pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.805044 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.804946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w"
Apr 16 14:54:50.919286 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:50.919252 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-mgz7w"]
Apr 16 14:54:50.923431 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:54:50.923406 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a60a02_b362_4786_b057_8df4ea2185ed.slice/crio-b10c9b6adc52be6e150dfe2fef417352afd968039e8bb2d855af6a7586394b9d WatchSource:0}: Error finding container b10c9b6adc52be6e150dfe2fef417352afd968039e8bb2d855af6a7586394b9d: Status 404 returned error can't find the container with id b10c9b6adc52be6e150dfe2fef417352afd968039e8bb2d855af6a7586394b9d
Apr 16 14:54:51.900088 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:51.900033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w" event={"ID":"29a60a02-b362-4786-b057-8df4ea2185ed","Type":"ContainerStarted","Data":"b10c9b6adc52be6e150dfe2fef417352afd968039e8bb2d855af6a7586394b9d"}
Apr 16 14:54:52.127025 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:52.126978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8"
Apr 16 14:54:52.127220 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:52.127120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp"
Apr 16 14:54:52.127220 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:52.127157 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:54:52.127220 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:52.127220 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls podName:be6cb6cd-b928-4807-90d1-c1f8d6657af1 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:54.127201134 +0000 UTC m=+283.234830259 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls") pod "dns-default-9zlt8" (UID: "be6cb6cd-b928-4807-90d1-c1f8d6657af1") : secret "dns-default-metrics-tls" not found
Apr 16 14:54:52.127385 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:52.127226 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:54:52.127385 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:52.127256 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert podName:7c5aa40b-af79-42ef-99df-394eb1b2d683 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:54.127246811 +0000 UTC m=+283.234875917 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert") pod "ingress-canary-hg2hp" (UID: "7c5aa40b-af79-42ef-99df-394eb1b2d683") : secret "canary-serving-cert" not found
Apr 16 14:54:52.903777 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:52.903737 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w" event={"ID":"29a60a02-b362-4786-b057-8df4ea2185ed","Type":"ContainerStarted","Data":"46b68bc20f34f09275dff00537c05a7180189802366edc645d57eaf8d3fe2380"}
Apr 16 14:54:52.916604 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:52.916548 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-mgz7w" podStartSLOduration=1.148820578 podStartE2EDuration="2.916526291s" podCreationTimestamp="2026-04-16 14:54:50 +0000 UTC" firstStartedPulling="2026-04-16 14:54:50.925556674 +0000 UTC m=+160.033185777" lastFinishedPulling="2026-04-16 14:54:52.693262385 +0000 UTC m=+161.800891490" observedRunningTime="2026-04-16 14:54:52.916382043 +0000 UTC m=+162.024011178" watchObservedRunningTime="2026-04-16 14:54:52.916526291 +0000 UTC m=+162.024155415"
Apr 16 14:54:54.446656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:54.446619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:54.447170 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:54.446692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:54:54.447170 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:54:54.446716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:54:54.447170 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:54.446815 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:54:54.447170 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:54.446862 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls podName:0848e866-88d6-48a6-abea-931262f45c54 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.446848877 +0000 UTC m=+179.554477979 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls") pod "cluster-samples-operator-667775844f-ps85k" (UID: "0848e866-88d6-48a6-abea-931262f45c54") : secret "samples-operator-tls" not found
Apr 16 14:54:54.447402 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:54.447220 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.447203732 +0000 UTC m=+179.554832836 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : configmap references non-existent config key: service-ca.crt
Apr 16 14:54:54.447402 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:54.447281 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:54:54.447402 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:54:54.447363 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs podName:99e316cc-57e8-4d70-bb6b-e957b5bccf87 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:10.447344385 +0000 UTC m=+179.554973493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs") pod "router-default-7468b496d-5qb6b" (UID: "99e316cc-57e8-4d70-bb6b-e957b5bccf87") : secret "router-metrics-certs-default" not found
Apr 16 14:55:01.468472 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:01.468393 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk"
Apr 16 14:55:04.467684 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:04.467649 2579 scope.go:117] "RemoveContainer" containerID="1d9abf0c6f83fbc17a2f47f08b9b69803f066ff2201e6a258936cec9715a93f3"
Apr 16 14:55:04.928463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:04.928381 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log"
Apr 16 14:55:04.928463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:04.928453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" event={"ID":"6dbffa58-0a86-4116-9fd8-0dca9f45e365","Type":"ContainerStarted","Data":"b1e76775c5871a2844931c52a265e0ead71f0b7f1dbb7631a06dc7e731a8f3bc"}
Apr 16 14:55:04.928726 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:04.928702 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:55:05.077034 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:05.077002 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8"
Apr 16 14:55:05.099875 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:05.099828 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-2lhl8" podStartSLOduration=25.386555271 podStartE2EDuration="27.099812097s" podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="2026-04-16 14:54:39.023332706 +0000 UTC m=+148.130961809" lastFinishedPulling="2026-04-16 14:54:40.736589529 +0000 UTC m=+149.844218635" observedRunningTime="2026-04-16 14:55:04.94245035 +0000 UTC m=+174.050079473" watchObservedRunningTime="2026-04-16 14:55:05.099812097 +0000 UTC m=+174.207441215"
Apr 16 14:55:09.334589 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.334552 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hxs84"]
Apr 16 14:55:09.337711 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.337688 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.339562 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.339537 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 14:55:09.340108 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.340085 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bg2mj\""
Apr 16 14:55:09.340108 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.340103 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 14:55:09.340268 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.340151 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:55:09.340337 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.340289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:55:09.346472 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.346450 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hxs84"]
Apr 16 14:55:09.366605 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.366571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/860d2f9f-664f-4720-9840-cd7e447f9aa3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.366605 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.366609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j466\" (UniqueName: \"kubernetes.io/projected/860d2f9f-664f-4720-9840-cd7e447f9aa3-kube-api-access-8j466\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.366813 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.366720 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/860d2f9f-664f-4720-9840-cd7e447f9aa3-crio-socket\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.366813 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.366751 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/860d2f9f-664f-4720-9840-cd7e447f9aa3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.366813 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.366785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/860d2f9f-664f-4720-9840-cd7e447f9aa3-data-volume\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.370497 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.370467 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5985778f7d-g4nxl"]
Apr 16 14:55:09.373343 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.373327 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.375287 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.375265 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-g2vw7\""
Apr 16 14:55:09.375386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.375365 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:55:09.375543 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.375528 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:55:09.375677 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.375564 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:55:09.385389 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.385355 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:55:09.387525 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.387498 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5985778f7d-g4nxl"]
Apr 16 14:55:09.467210 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/860d2f9f-664f-4720-9840-cd7e447f9aa3-crio-socket\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.467381 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467224 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/860d2f9f-664f-4720-9840-cd7e447f9aa3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.467381 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j466\" (UniqueName: \"kubernetes.io/projected/860d2f9f-664f-4720-9840-cd7e447f9aa3-kube-api-access-8j466\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.467381 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/860d2f9f-664f-4720-9840-cd7e447f9aa3-crio-socket\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.467501 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467389 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smctm\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-kube-api-access-smctm\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467534 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/860d2f9f-664f-4720-9840-cd7e447f9aa3-data-volume\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.467575 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea13ce08-5ce8-4080-9afb-976057b2a884-installation-pull-secrets\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467631 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467617 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea13ce08-5ce8-4080-9afb-976057b2a884-registry-certificates\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467682 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea13ce08-5ce8-4080-9afb-976057b2a884-trusted-ca\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467732 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-registry-tls\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467732 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/860d2f9f-664f-4720-9840-cd7e447f9aa3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.467834 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467775 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea13ce08-5ce8-4080-9afb-976057b2a884-image-registry-private-configuration\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467834 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-bound-sa-token\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467834 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea13ce08-5ce8-4080-9afb-976057b2a884-ca-trust-extracted\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.467988 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.467958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/860d2f9f-664f-4720-9840-cd7e447f9aa3-data-volume\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.468323 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.468306 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/860d2f9f-664f-4720-9840-cd7e447f9aa3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.469820 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.469804 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/860d2f9f-664f-4720-9840-cd7e447f9aa3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.477781 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.477752 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j466\" (UniqueName: \"kubernetes.io/projected/860d2f9f-664f-4720-9840-cd7e447f9aa3-kube-api-access-8j466\") pod \"insights-runtime-extractor-hxs84\" (UID: \"860d2f9f-664f-4720-9840-cd7e447f9aa3\") " pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.568168 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-registry-tls\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568378 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea13ce08-5ce8-4080-9afb-976057b2a884-image-registry-private-configuration\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568378 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-bound-sa-token\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568378 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea13ce08-5ce8-4080-9afb-976057b2a884-ca-trust-extracted\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568378 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smctm\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-kube-api-access-smctm\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568378 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea13ce08-5ce8-4080-9afb-976057b2a884-installation-pull-secrets\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568625 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea13ce08-5ce8-4080-9afb-976057b2a884-registry-certificates\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568625 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea13ce08-5ce8-4080-9afb-976057b2a884-trusted-ca\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.568899 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.568873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea13ce08-5ce8-4080-9afb-976057b2a884-ca-trust-extracted\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.569926 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.569897 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea13ce08-5ce8-4080-9afb-976057b2a884-registry-certificates\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.570043 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.569941 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea13ce08-5ce8-4080-9afb-976057b2a884-trusted-ca\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.570789 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.570761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-registry-tls\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.570789 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.570766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea13ce08-5ce8-4080-9afb-976057b2a884-image-registry-private-configuration\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.570925 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.570905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea13ce08-5ce8-4080-9afb-976057b2a884-installation-pull-secrets\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.578154 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.578121 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-bound-sa-token\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.578301 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.578281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smctm\" (UniqueName: \"kubernetes.io/projected/ea13ce08-5ce8-4080-9afb-976057b2a884-kube-api-access-smctm\") pod \"image-registry-5985778f7d-g4nxl\" (UID: \"ea13ce08-5ce8-4080-9afb-976057b2a884\") " pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.646912 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.646813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hxs84"
Apr 16 14:55:09.688677 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.688642 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.785299 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.785273 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hxs84"]
Apr 16 14:55:09.788563 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:09.788519 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860d2f9f_664f_4720_9840_cd7e447f9aa3.slice/crio-824dc74e8997d0e93a7e7da985543d37f374c086df0199a9c7d6d11aaa7e8552 WatchSource:0}: Error finding container 824dc74e8997d0e93a7e7da985543d37f374c086df0199a9c7d6d11aaa7e8552: Status 404 returned error can't find the container with id 824dc74e8997d0e93a7e7da985543d37f374c086df0199a9c7d6d11aaa7e8552
Apr 16 14:55:09.819603 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.819576 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5985778f7d-g4nxl"]
Apr 16 14:55:09.822977 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:09.822950 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea13ce08_5ce8_4080_9afb_976057b2a884.slice/crio-806f8e7a0e0e3949b67fa3de0ad3e6cd385b839d3e57925eac20e2c675656a23 WatchSource:0}: Error finding container 806f8e7a0e0e3949b67fa3de0ad3e6cd385b839d3e57925eac20e2c675656a23: Status 404 returned error can't find the container with id 806f8e7a0e0e3949b67fa3de0ad3e6cd385b839d3e57925eac20e2c675656a23
Apr 16 14:55:09.942511 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.942477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl" event={"ID":"ea13ce08-5ce8-4080-9afb-976057b2a884","Type":"ContainerStarted","Data":"b91e110e7d554cc909b004fa37e01ccfe79e01523c79d7d9127834a6e4148d93"}
Apr 16 14:55:09.942669 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.942518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl" event={"ID":"ea13ce08-5ce8-4080-9afb-976057b2a884","Type":"ContainerStarted","Data":"806f8e7a0e0e3949b67fa3de0ad3e6cd385b839d3e57925eac20e2c675656a23"}
Apr 16 14:55:09.942669 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.942596 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl"
Apr 16 14:55:09.943795 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.943771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hxs84" event={"ID":"860d2f9f-664f-4720-9840-cd7e447f9aa3","Type":"ContainerStarted","Data":"5afc1f6539f373f9427fe50d5c181c71fc0bab0f44f71de78b7152118d1f511c"}
Apr 16 14:55:09.943880 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.943801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hxs84" event={"ID":"860d2f9f-664f-4720-9840-cd7e447f9aa3","Type":"ContainerStarted","Data":"824dc74e8997d0e93a7e7da985543d37f374c086df0199a9c7d6d11aaa7e8552"}
Apr 16 14:55:09.958028 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:09.957982 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl" podStartSLOduration=0.957966822 podStartE2EDuration="957.966822ms" podCreationTimestamp="2026-04-16 14:55:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:09.957339773 +0000 UTC m=+179.064968897" watchObservedRunningTime="2026-04-16 14:55:09.957966822 +0000 UTC m=+179.065595949"
Apr 16 14:55:10.475373 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.475343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:55:10.475680 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.475380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"
Apr 16 14:55:10.475680 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.475424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:55:10.476010 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.475990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99e316cc-57e8-4d70-bb6b-e957b5bccf87-service-ca-bundle\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b"
Apr 16 14:55:10.477777 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.477753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0848e866-88d6-48a6-abea-931262f45c54-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-ps85k\" (UID: \"0848e866-88d6-48a6-abea-931262f45c54\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" Apr 16 14:55:10.477875 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.477753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99e316cc-57e8-4d70-bb6b-e957b5bccf87-metrics-certs\") pod \"router-default-7468b496d-5qb6b\" (UID: \"99e316cc-57e8-4d70-bb6b-e957b5bccf87\") " pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:55:10.706237 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.706205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" Apr 16 14:55:10.713145 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.712998 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:55:10.826547 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.826516 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k"] Apr 16 14:55:10.843901 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.843872 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7468b496d-5qb6b"] Apr 16 14:55:10.847102 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:10.847045 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e316cc_57e8_4d70_bb6b_e957b5bccf87.slice/crio-5d4b9e82864ae12b4d0817ee97691b9b8463732eb842a8aa5e811b26883f8ca2 WatchSource:0}: Error finding container 5d4b9e82864ae12b4d0817ee97691b9b8463732eb842a8aa5e811b26883f8ca2: Status 404 returned error can't find the container with id 5d4b9e82864ae12b4d0817ee97691b9b8463732eb842a8aa5e811b26883f8ca2 Apr 16 14:55:10.948322 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.948277 
2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" event={"ID":"0848e866-88d6-48a6-abea-931262f45c54","Type":"ContainerStarted","Data":"0d4bf02c410a37cb7b9497104fd7f2a5009b2d185cbd3ca8158a8f353f4a43a5"} Apr 16 14:55:10.950174 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.950125 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hxs84" event={"ID":"860d2f9f-664f-4720-9840-cd7e447f9aa3","Type":"ContainerStarted","Data":"3711100afc1a7a31258dc4c786c9ba1b305f9f3ba0a9b9578927b750d1cd6e6d"} Apr 16 14:55:10.951491 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.951454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7468b496d-5qb6b" event={"ID":"99e316cc-57e8-4d70-bb6b-e957b5bccf87","Type":"ContainerStarted","Data":"50a9251f84bc80cbba4e04cf4d376988299e581a221d7c438360752ebf8a72a8"} Apr 16 14:55:10.951623 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.951493 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7468b496d-5qb6b" event={"ID":"99e316cc-57e8-4d70-bb6b-e957b5bccf87","Type":"ContainerStarted","Data":"5d4b9e82864ae12b4d0817ee97691b9b8463732eb842a8aa5e811b26883f8ca2"} Apr 16 14:55:10.967272 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:10.967187 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7468b496d-5qb6b" podStartSLOduration=32.967171509 podStartE2EDuration="32.967171509s" podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:10.966122407 +0000 UTC m=+180.073751550" watchObservedRunningTime="2026-04-16 14:55:10.967171509 +0000 UTC m=+180.074800633" Apr 16 14:55:11.713516 ip-10-0-129-105 kubenswrapper[2579]: I0416 
14:55:11.713424 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:55:11.716670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:11.716645 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:55:11.954764 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:11.954727 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:55:11.956186 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:11.956167 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7468b496d-5qb6b" Apr 16 14:55:12.958141 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:12.958102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hxs84" event={"ID":"860d2f9f-664f-4720-9840-cd7e447f9aa3","Type":"ContainerStarted","Data":"c456627e0caecac04fe69f3e05cff41fd700d1870c77768d55516340412beeb8"} Apr 16 14:55:12.959588 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:12.959562 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" event={"ID":"0848e866-88d6-48a6-abea-931262f45c54","Type":"ContainerStarted","Data":"2f444ac61693ef5fe1833d37c6ba1b530403acd0582fc83b8571662f2d790b9c"} Apr 16 14:55:12.959688 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:12.959596 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" event={"ID":"0848e866-88d6-48a6-abea-931262f45c54","Type":"ContainerStarted","Data":"07a6e270c4c79777cd51c6e1f3e9dc94a0528648f13f2588ceda46c374a61b60"} Apr 16 14:55:12.974445 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:12.974396 2579 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hxs84" podStartSLOduration=1.48975358 podStartE2EDuration="3.974382634s" podCreationTimestamp="2026-04-16 14:55:09 +0000 UTC" firstStartedPulling="2026-04-16 14:55:09.860813693 +0000 UTC m=+178.968442801" lastFinishedPulling="2026-04-16 14:55:12.345442753 +0000 UTC m=+181.453071855" observedRunningTime="2026-04-16 14:55:12.974118752 +0000 UTC m=+182.081747875" watchObservedRunningTime="2026-04-16 14:55:12.974382634 +0000 UTC m=+182.082011758" Apr 16 14:55:12.987370 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:12.987322 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-ps85k" podStartSLOduration=33.472022044 podStartE2EDuration="34.987309854s" podCreationTimestamp="2026-04-16 14:54:38 +0000 UTC" firstStartedPulling="2026-04-16 14:55:10.87660972 +0000 UTC m=+179.984238839" lastFinishedPulling="2026-04-16 14:55:12.391897545 +0000 UTC m=+181.499526649" observedRunningTime="2026-04-16 14:55:12.986956608 +0000 UTC m=+182.094585732" watchObservedRunningTime="2026-04-16 14:55:12.987309854 +0000 UTC m=+182.094938969" Apr 16 14:55:14.568533 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.568492 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz"] Apr 16 14:55:14.571614 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.571596 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:14.573509 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.573488 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 14:55:14.573634 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.573563 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-pm575\"" Apr 16 14:55:14.587729 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.587703 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz"] Apr 16 14:55:14.611939 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.611903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ecc581b0-e89d-4a7f-948b-429c743874bc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-7lxqz\" (UID: \"ecc581b0-e89d-4a7f-948b-429c743874bc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:14.713008 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:14.712967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ecc581b0-e89d-4a7f-948b-429c743874bc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-7lxqz\" (UID: \"ecc581b0-e89d-4a7f-948b-429c743874bc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:14.713162 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:55:14.713126 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" 
not found Apr 16 14:55:14.713225 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:55:14.713196 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc581b0-e89d-4a7f-948b-429c743874bc-tls-certificates podName:ecc581b0-e89d-4a7f-948b-429c743874bc nodeName:}" failed. No retries permitted until 2026-04-16 14:55:15.213175503 +0000 UTC m=+184.320804605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/ecc581b0-e89d-4a7f-948b-429c743874bc-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-7lxqz" (UID: "ecc581b0-e89d-4a7f-948b-429c743874bc") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 14:55:15.216777 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:15.216721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ecc581b0-e89d-4a7f-948b-429c743874bc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-7lxqz\" (UID: \"ecc581b0-e89d-4a7f-948b-429c743874bc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:15.219250 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:15.219227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ecc581b0-e89d-4a7f-948b-429c743874bc-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-7lxqz\" (UID: \"ecc581b0-e89d-4a7f-948b-429c743874bc\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:15.479910 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:15.479827 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:15.593370 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:15.593337 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz"] Apr 16 14:55:15.597123 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:15.597053 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc581b0_e89d_4a7f_948b_429c743874bc.slice/crio-5382d793d2a4e3b5ea2fe33e1ea4decd114b61294ae3b13553af113ab857e834 WatchSource:0}: Error finding container 5382d793d2a4e3b5ea2fe33e1ea4decd114b61294ae3b13553af113ab857e834: Status 404 returned error can't find the container with id 5382d793d2a4e3b5ea2fe33e1ea4decd114b61294ae3b13553af113ab857e834 Apr 16 14:55:15.971503 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:15.971468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" event={"ID":"ecc581b0-e89d-4a7f-948b-429c743874bc","Type":"ContainerStarted","Data":"5382d793d2a4e3b5ea2fe33e1ea4decd114b61294ae3b13553af113ab857e834"} Apr 16 14:55:16.975662 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:16.975621 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" event={"ID":"ecc581b0-e89d-4a7f-948b-429c743874bc","Type":"ContainerStarted","Data":"48f812cc8e035972a96515a8af26a1a24b45dc24b057a4912f81c9b86c4500d3"} Apr 16 14:55:16.976050 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:16.975823 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:16.980464 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:16.980440 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" Apr 16 14:55:16.989383 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:16.989338 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-7lxqz" podStartSLOduration=2.046035007 podStartE2EDuration="2.989320936s" podCreationTimestamp="2026-04-16 14:55:14 +0000 UTC" firstStartedPulling="2026-04-16 14:55:15.599380982 +0000 UTC m=+184.707010100" lastFinishedPulling="2026-04-16 14:55:16.542666906 +0000 UTC m=+185.650296029" observedRunningTime="2026-04-16 14:55:16.988814734 +0000 UTC m=+186.096443858" watchObservedRunningTime="2026-04-16 14:55:16.989320936 +0000 UTC m=+186.096950061" Apr 16 14:55:21.956997 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.956960 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4drxr"] Apr 16 14:55:21.960512 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.960488 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:21.962716 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.962696 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 14:55:21.963570 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.963415 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8xtdm\"" Apr 16 14:55:21.963570 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.963429 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:55:21.963570 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.963463 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:55:21.963570 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.963420 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:21.963828 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.963752 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:55:21.973135 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.973103 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4drxr"] Apr 16 14:55:21.974050 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.974031 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-69bnh"] Apr 16 14:55:21.977566 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.977550 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:21.980188 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.979691 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:21.980188 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.979937 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p6x8s\"" Apr 16 14:55:21.980188 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.979983 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:21.981756 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.981738 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-cvmbz"] Apr 16 14:55:21.983050 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.983034 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:21.986703 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.986683 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:21.988506 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.988487 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 14:55:21.988641 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.988625 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vxk6c\"" Apr 16 14:55:21.988721 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.988708 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:55:21.989147 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:21.989129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 14:55:22.000939 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.000918 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-cvmbz"] Apr 16 14:55:22.076782 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076747 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ww2m\" (UniqueName: \"kubernetes.io/projected/23b0fb83-9383-4487-9c59-ea958ea92af3-kube-api-access-9ww2m\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.076782 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-sys\") pod \"node-exporter-69bnh\" (UID: 
\"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jcr2\" (UniqueName: \"kubernetes.io/projected/48808a06-57df-42da-95c5-7989c2599f55-kube-api-access-9jcr2\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076911 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23b0fb83-9383-4487-9c59-ea958ea92af3-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 
14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-tls\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.076986 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-root\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077032 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c04e1a6b-9b30-411d-9334-189a455233d6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077037 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-wtmp\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077115 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077139 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c04e1a6b-9b30-411d-9334-189a455233d6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077167 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r24lt\" (UniqueName: \"kubernetes.io/projected/c04e1a6b-9b30-411d-9334-189a455233d6-kube-api-access-r24lt\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48808a06-57df-42da-95c5-7989c2599f55-metrics-client-ca\") pod 
\"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-accelerators-collector-config\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-textfile\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077315 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.077372 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.077351 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.178489 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.178699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178518 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ww2m\" (UniqueName: \"kubernetes.io/projected/23b0fb83-9383-4487-9c59-ea958ea92af3-kube-api-access-9ww2m\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.178699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-sys\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178699 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:55:22.178567 2579 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 14:55:22.178699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.178699 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:55:22.178646 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-tls podName:c04e1a6b-9b30-411d-9334-189a455233d6 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.678624779 +0000 UTC m=+191.786253895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-cvmbz" (UID: "c04e1a6b-9b30-411d-9334-189a455233d6") : secret "kube-state-metrics-tls" not found Apr 16 14:55:22.178699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jcr2\" (UniqueName: \"kubernetes.io/projected/48808a06-57df-42da-95c5-7989c2599f55-kube-api-access-9jcr2\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-sys\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178721 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23b0fb83-9383-4487-9c59-ea958ea92af3-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178777 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-tls\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-root\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178832 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c04e1a6b-9b30-411d-9334-189a455233d6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-wtmp\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178923 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c04e1a6b-9b30-411d-9334-189a455233d6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:55:22.178953 2579 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.178978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-r24lt\" (UniqueName: \"kubernetes.io/projected/c04e1a6b-9b30-411d-9334-189a455233d6-kube-api-access-r24lt\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.178996 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:55:22.179007 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-tls podName:23b0fb83-9383-4487-9c59-ea958ea92af3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.678995878 +0000 UTC m=+191.786624984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-4drxr" (UID: "23b0fb83-9383-4487-9c59-ea958ea92af3") : secret "openshift-state-metrics-tls" not found Apr 16 14:55:22.179656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48808a06-57df-42da-95c5-7989c2599f55-metrics-client-ca\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.179656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-accelerators-collector-config\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.179656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179143 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-textfile\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.179656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.179656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-root\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.179656 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c04e1a6b-9b30-411d-9334-189a455233d6-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.179948 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.179716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23b0fb83-9383-4487-9c59-ea958ea92af3-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " 
pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.180199 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.180175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48808a06-57df-42da-95c5-7989c2599f55-metrics-client-ca\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.180335 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.180279 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-textfile\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.180454 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.180373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-accelerators-collector-config\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.180454 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.180365 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-wtmp\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.180737 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.180715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/c04e1a6b-9b30-411d-9334-189a455233d6-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.180921 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.180898 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.182583 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.182444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-tls\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.182583 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.182466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.183064 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.183034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-cvmbz\" 
(UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.184104 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.184061 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48808a06-57df-42da-95c5-7989c2599f55-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.191094 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.187172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ww2m\" (UniqueName: \"kubernetes.io/projected/23b0fb83-9383-4487-9c59-ea958ea92af3-kube-api-access-9ww2m\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.191094 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.187587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jcr2\" (UniqueName: \"kubernetes.io/projected/48808a06-57df-42da-95c5-7989c2599f55-kube-api-access-9jcr2\") pod \"node-exporter-69bnh\" (UID: \"48808a06-57df-42da-95c5-7989c2599f55\") " pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.191094 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.187663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r24lt\" (UniqueName: \"kubernetes.io/projected/c04e1a6b-9b30-411d-9334-189a455233d6-kube-api-access-r24lt\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.289524 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.289441 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-69bnh" Apr 16 14:55:22.305364 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:22.305326 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48808a06_57df_42da_95c5_7989c2599f55.slice/crio-40a240e37481233b3ecef3605915e9124689b03c645c4f084d92fdd7acaeb8ca WatchSource:0}: Error finding container 40a240e37481233b3ecef3605915e9124689b03c645c4f084d92fdd7acaeb8ca: Status 404 returned error can't find the container with id 40a240e37481233b3ecef3605915e9124689b03c645c4f084d92fdd7acaeb8ca Apr 16 14:55:22.682791 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.682696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.682791 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.682769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.685487 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.685459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/23b0fb83-9383-4487-9c59-ea958ea92af3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4drxr\" (UID: \"23b0fb83-9383-4487-9c59-ea958ea92af3\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" 
Apr 16 14:55:22.685616 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.685596 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c04e1a6b-9b30-411d-9334-189a455233d6-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-cvmbz\" (UID: \"c04e1a6b-9b30-411d-9334-189a455233d6\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.871865 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.871828 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" Apr 16 14:55:22.897879 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.897846 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" Apr 16 14:55:22.992280 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:22.992240 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69bnh" event={"ID":"48808a06-57df-42da-95c5-7989c2599f55","Type":"ContainerStarted","Data":"40a240e37481233b3ecef3605915e9124689b03c645c4f084d92fdd7acaeb8ca"} Apr 16 14:55:23.040727 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.040695 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:23.050647 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.050538 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.053015 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.052777 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 14:55:23.053148 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.053099 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 14:55:23.053182 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.053170 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 14:55:23.056858 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.053432 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 14:55:23.056858 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.053432 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 14:55:23.056858 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.054786 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 14:55:23.056858 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.055103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 14:55:23.056858 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.055374 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 14:55:23.056858 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.056536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-j5kdg\"" Apr 16 14:55:23.057254 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.056880 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 14:55:23.058371 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.058089 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 14:55:23.087702 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.087668 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-cvmbz"] Apr 16 14:55:23.093871 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:23.093819 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04e1a6b_9b30_411d_9334_189a455233d6.slice/crio-441b1004d05302da631ed514945bf18f8376b23d98a4d492f62dfb7616ca43f6 WatchSource:0}: Error finding container 441b1004d05302da631ed514945bf18f8376b23d98a4d492f62dfb7616ca43f6: Status 404 returned error can't find the container with id 441b1004d05302da631ed514945bf18f8376b23d98a4d492f62dfb7616ca43f6 Apr 16 14:55:23.104030 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.104004 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4drxr"] Apr 16 14:55:23.107423 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:23.107392 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b0fb83_9383_4487_9c59_ea958ea92af3.slice/crio-37d999ab05331862df317a197949629562ac10aee2f5011fd74625217e0dcfbd WatchSource:0}: Error finding container 37d999ab05331862df317a197949629562ac10aee2f5011fd74625217e0dcfbd: Status 404 returned error can't find the container with id 
37d999ab05331862df317a197949629562ac10aee2f5011fd74625217e0dcfbd Apr 16 14:55:23.188128 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e023847-8b2f-4ce9-8b72-3047e39b02df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188272 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6e023847-8b2f-4ce9-8b72-3047e39b02df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188272 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188179 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-config-volume\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188272 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188200 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188272 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188272 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188239 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-web-config\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188537 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e023847-8b2f-4ce9-8b72-3047e39b02df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188537 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188537 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188426 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
14:55:23.188537 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftvxh\" (UniqueName: \"kubernetes.io/projected/6e023847-8b2f-4ce9-8b72-3047e39b02df-kube-api-access-ftvxh\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188537 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188501 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e023847-8b2f-4ce9-8b72-3047e39b02df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188747 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e023847-8b2f-4ce9-8b72-3047e39b02df-config-out\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.188747 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.188594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 14:55:23.289342 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289313 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-config-volume\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-web-config\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289463 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289433 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e023847-8b2f-4ce9-8b72-3047e39b02df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftvxh\" (UniqueName: \"kubernetes.io/projected/6e023847-8b2f-4ce9-8b72-3047e39b02df-kube-api-access-ftvxh\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e023847-8b2f-4ce9-8b72-3047e39b02df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289699 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e023847-8b2f-4ce9-8b72-3047e39b02df-config-out\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289943 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289943 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e023847-8b2f-4ce9-8b72-3047e39b02df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.289943 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.289816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6e023847-8b2f-4ce9-8b72-3047e39b02df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.290899 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.290632 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6e023847-8b2f-4ce9-8b72-3047e39b02df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.290899 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.290865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e023847-8b2f-4ce9-8b72-3047e39b02df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.291180 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.291059 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e023847-8b2f-4ce9-8b72-3047e39b02df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.294742 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.294702 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.294742 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.294723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.294889 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.294773 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.294938 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.294907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-web-config\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.295210 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.295167 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6e023847-8b2f-4ce9-8b72-3047e39b02df-config-out\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.295210 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.295178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6e023847-8b2f-4ce9-8b72-3047e39b02df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.295210 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.295192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-config-volume\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.295535 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.295518 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.296221 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.296195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6e023847-8b2f-4ce9-8b72-3047e39b02df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.298116 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.298096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftvxh\" (UniqueName: \"kubernetes.io/projected/6e023847-8b2f-4ce9-8b72-3047e39b02df-kube-api-access-ftvxh\") pod \"alertmanager-main-0\" (UID: \"6e023847-8b2f-4ce9-8b72-3047e39b02df\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.382437 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.382403 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 14:55:23.507017 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.506932 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 14:55:23.510304 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:23.510278 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e023847_8b2f_4ce9_8b72_3047e39b02df.slice/crio-b7bde1a7402be2c06968598bd288a7a9edb9ca3b92c3a88638c0c7033b417086 WatchSource:0}: Error finding container b7bde1a7402be2c06968598bd288a7a9edb9ca3b92c3a88638c0c7033b417086: Status 404 returned error can't find the container with id b7bde1a7402be2c06968598bd288a7a9edb9ca3b92c3a88638c0c7033b417086
Apr 16 14:55:23.997284 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.997245 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" event={"ID":"c04e1a6b-9b30-411d-9334-189a455233d6","Type":"ContainerStarted","Data":"441b1004d05302da631ed514945bf18f8376b23d98a4d492f62dfb7616ca43f6"}
Apr 16 14:55:23.999092 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.998964 2579 generic.go:358] "Generic (PLEG): container finished" podID="48808a06-57df-42da-95c5-7989c2599f55" containerID="1f8e510a25d68b12bddea31ac7aece197b4afb5fa8132e00cb6de3e52614ffcb" exitCode=0
Apr 16 14:55:23.999092 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:23.999042 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69bnh" event={"ID":"48808a06-57df-42da-95c5-7989c2599f55","Type":"ContainerDied","Data":"1f8e510a25d68b12bddea31ac7aece197b4afb5fa8132e00cb6de3e52614ffcb"}
Apr 16 14:55:24.000910 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:24.000870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"b7bde1a7402be2c06968598bd288a7a9edb9ca3b92c3a88638c0c7033b417086"}
Apr 16 14:55:24.007975 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:24.007940 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" event={"ID":"23b0fb83-9383-4487-9c59-ea958ea92af3","Type":"ContainerStarted","Data":"218273176c52d981abd917059af3edcd5b74a0cc1dfef9904946e20ba3b120dc"}
Apr 16 14:55:24.008115 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:24.007977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" event={"ID":"23b0fb83-9383-4487-9c59-ea958ea92af3","Type":"ContainerStarted","Data":"fc108dcbaf56ed8757ea71975a1e7dd407f27fa1406da3c6be810c98bdc5780c"}
Apr 16 14:55:24.008115 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:24.007990 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" event={"ID":"23b0fb83-9383-4487-9c59-ea958ea92af3","Type":"ContainerStarted","Data":"37d999ab05331862df317a197949629562ac10aee2f5011fd74625217e0dcfbd"}
Apr 16 14:55:25.012961 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.012929 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" event={"ID":"c04e1a6b-9b30-411d-9334-189a455233d6","Type":"ContainerStarted","Data":"95c832e764378bbde139f3b879d4ea0aba5f508093b5d1f3668dc90b415712ba"}
Apr 16 14:55:25.013424 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.013405 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" event={"ID":"c04e1a6b-9b30-411d-9334-189a455233d6","Type":"ContainerStarted","Data":"f0f546c356ea4cbf3569d67884e96a979c7c0683bd7e6b34fe2f51e106a271ab"}
Apr 16 14:55:25.015346 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.015285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69bnh" event={"ID":"48808a06-57df-42da-95c5-7989c2599f55","Type":"ContainerStarted","Data":"6e03503da46a1ace0bb531f6d78df5c1692b8e73c304c8f4fbeb5864e36ad957"}
Apr 16 14:55:25.015346 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.015315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-69bnh" event={"ID":"48808a06-57df-42da-95c5-7989c2599f55","Type":"ContainerStarted","Data":"5804cc74ab42219b7b936eacbd8dfae8d534243fad03942d4ed1b338a826dd6c"}
Apr 16 14:55:25.018738 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.017627 2579 generic.go:358] "Generic (PLEG): container finished" podID="6e023847-8b2f-4ce9-8b72-3047e39b02df" containerID="454aa4cae713c091daa56271c085b76c9a907338fd7fa1503d9463e81400f372" exitCode=0
Apr 16 14:55:25.018738 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.017699 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerDied","Data":"454aa4cae713c091daa56271c085b76c9a907338fd7fa1503d9463e81400f372"}
Apr 16 14:55:25.023391 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.023251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" event={"ID":"23b0fb83-9383-4487-9c59-ea958ea92af3","Type":"ContainerStarted","Data":"f441d3e01459d3c1440dd503e4c371f14741a350d49f6fa20bf0ab551ac33607"}
Apr 16 14:55:25.035109 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.034672 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-69bnh" podStartSLOduration=3.3452038379999998 podStartE2EDuration="4.03465104s" podCreationTimestamp="2026-04-16 14:55:21 +0000 UTC" firstStartedPulling="2026-04-16 14:55:22.307607724 +0000 UTC m=+191.415236827" lastFinishedPulling="2026-04-16 14:55:22.997054924 +0000 UTC m=+192.104684029" observedRunningTime="2026-04-16 14:55:25.032586892 +0000 UTC m=+194.140216028" watchObservedRunningTime="2026-04-16 14:55:25.03465104 +0000 UTC m=+194.142280165"
Apr 16 14:55:25.071423 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:25.071367 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4drxr" podStartSLOduration=2.636447909 podStartE2EDuration="4.071348486s" podCreationTimestamp="2026-04-16 14:55:21 +0000 UTC" firstStartedPulling="2026-04-16 14:55:23.281876541 +0000 UTC m=+192.389505650" lastFinishedPulling="2026-04-16 14:55:24.716777124 +0000 UTC m=+193.824406227" observedRunningTime="2026-04-16 14:55:25.069699614 +0000 UTC m=+194.177328737" watchObservedRunningTime="2026-04-16 14:55:25.071348486 +0000 UTC m=+194.178977610"
Apr 16 14:55:26.028515 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.028475 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" event={"ID":"c04e1a6b-9b30-411d-9334-189a455233d6","Type":"ContainerStarted","Data":"9c908748443d81912daed6f6010001e572539972f732d565de073bcfaf2c5a66"}
Apr 16 14:55:26.048411 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.048358 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-cvmbz" podStartSLOduration=3.428129055 podStartE2EDuration="5.048343973s" podCreationTimestamp="2026-04-16 14:55:21 +0000 UTC" firstStartedPulling="2026-04-16 14:55:23.096465903 +0000 UTC m=+192.204095011" lastFinishedPulling="2026-04-16 14:55:24.716680827 +0000 UTC m=+193.824309929" observedRunningTime="2026-04-16 14:55:26.04618863 +0000 UTC m=+195.153817767" watchObservedRunningTime="2026-04-16 14:55:26.048343973 +0000 UTC m=+195.155973097"
Apr 16 14:55:26.382641 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.382538 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-74b95c45b8-6gtpp"]
Apr 16 14:55:26.385516 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.385487 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.387454 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.387423 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 14:55:26.387716 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.387701 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zzkkq\""
Apr 16 14:55:26.387821 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.387710 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 14:55:26.387821 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.387801 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7e8vsd2jm9hjt\""
Apr 16 14:55:26.387821 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.387800 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:55:26.387984 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.387829 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 14:55:26.395859 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.395820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74b95c45b8-6gtpp"]
Apr 16 14:55:26.519450 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519420 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8401793b-8665-481a-9533-86b9379b6202-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.519552 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519464 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8401793b-8665-481a-9533-86b9379b6202-metrics-server-audit-profiles\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.519552 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-client-ca-bundle\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.519552 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519519 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4lvh\" (UniqueName: \"kubernetes.io/projected/8401793b-8665-481a-9533-86b9379b6202-kube-api-access-j4lvh\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.519680 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-secret-metrics-server-client-certs\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.519749 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-secret-metrics-server-tls\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.519787 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.519762 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8401793b-8665-481a-9533-86b9379b6202-audit-log\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.620961 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.620930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-secret-metrics-server-tls\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.621059 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.620977 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8401793b-8665-481a-9533-86b9379b6202-audit-log\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.621059 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.621025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8401793b-8665-481a-9533-86b9379b6202-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.621059 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.621042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8401793b-8665-481a-9533-86b9379b6202-metrics-server-audit-profiles\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.621177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.621064 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-client-ca-bundle\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.621177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.621109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4lvh\" (UniqueName: \"kubernetes.io/projected/8401793b-8665-481a-9533-86b9379b6202-kube-api-access-j4lvh\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.621177 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.621132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-secret-metrics-server-client-certs\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.622246 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.622221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/8401793b-8665-481a-9533-86b9379b6202-metrics-server-audit-profiles\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.622981 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.622940 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/8401793b-8665-481a-9533-86b9379b6202-audit-log\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.623794 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.623748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-secret-metrics-server-tls\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.624595 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.624567 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8401793b-8665-481a-9533-86b9379b6202-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.624750 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.624732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-secret-metrics-server-client-certs\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.624801 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.624777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8401793b-8665-481a-9533-86b9379b6202-client-ca-bundle\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.631053 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.631029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4lvh\" (UniqueName: \"kubernetes.io/projected/8401793b-8665-481a-9533-86b9379b6202-kube-api-access-j4lvh\") pod \"metrics-server-74b95c45b8-6gtpp\" (UID: \"8401793b-8665-481a-9533-86b9379b6202\") " pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.698131 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.698107 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp"
Apr 16 14:55:26.752113 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.752050 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"]
Apr 16 14:55:26.755281 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.755257 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"
Apr 16 14:55:26.757779 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.757362 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-8qcd5\""
Apr 16 14:55:26.757779 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.757615 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 14:55:26.769331 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.769304 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"]
Apr 16 14:55:26.822862 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.822826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/783b5152-8549-4302-a3b6-604516a6e27d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-mcsf6\" (UID: \"783b5152-8549-4302-a3b6-604516a6e27d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"
Apr 16 14:55:26.835044 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.835011 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74b95c45b8-6gtpp"]
Apr 16 14:55:26.839415 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:26.839378 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8401793b_8665_481a_9533_86b9379b6202.slice/crio-1e1ac267f332d988b90ff2297d57f32713709a5a7caf20100537e25fba356612 WatchSource:0}: Error finding container 1e1ac267f332d988b90ff2297d57f32713709a5a7caf20100537e25fba356612: Status 404 returned error can't find the container with id 1e1ac267f332d988b90ff2297d57f32713709a5a7caf20100537e25fba356612
Apr 16 14:55:26.923582 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.923492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/783b5152-8549-4302-a3b6-604516a6e27d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-mcsf6\" (UID: \"783b5152-8549-4302-a3b6-604516a6e27d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"
Apr 16 14:55:26.925838 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:26.925812 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/783b5152-8549-4302-a3b6-604516a6e27d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-mcsf6\" (UID: \"783b5152-8549-4302-a3b6-604516a6e27d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"
Apr 16 14:55:27.034465 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.034418 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"d39f904a4f96aa18a5792ff507739e1b3139f9ba13ed17a04e6f819d9b06103a"}
Apr 16 14:55:27.034465 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.034466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"e1c7817276bfaa0d5f09e4392f9161370c703ef6cb5b4f0d01bde8f4a2507e26"}
Apr 16 14:55:27.034982 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.034482 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"61942b24bdff26b89ee984f90e559f690e8709c6ed314e704237ea1fea6e6536"}
Apr 16 14:55:27.034982 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.034494 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"7f132ef3420fedbd4b19e7f6c0ea82a215798acd0390efe49fd7fcf3fd439f68"}
Apr 16 14:55:27.034982 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.034507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"cc1c6763167a3f4c0244cf38fe5e78a9e405f05153b9c5fcdd6babf14e6a1309"}
Apr 16 14:55:27.035735 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.035704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" event={"ID":"8401793b-8665-481a-9533-86b9379b6202","Type":"ContainerStarted","Data":"1e1ac267f332d988b90ff2297d57f32713709a5a7caf20100537e25fba356612"}
Apr 16 14:55:27.068503 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.068461 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"
Apr 16 14:55:27.203777 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:27.203734 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6"]
Apr 16 14:55:27.245697 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:27.245663 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783b5152_8549_4302_a3b6_604516a6e27d.slice/crio-7d568aaf6f90e5f577afa3ea6ad10923b053f4593e3a4086786501259894d677 WatchSource:0}: Error finding container 7d568aaf6f90e5f577afa3ea6ad10923b053f4593e3a4086786501259894d677: Status 404 returned error can't find the container with id 7d568aaf6f90e5f577afa3ea6ad10923b053f4593e3a4086786501259894d677
Apr 16 14:55:28.039917 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.039864 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6" event={"ID":"783b5152-8549-4302-a3b6-604516a6e27d","Type":"ContainerStarted","Data":"7d568aaf6f90e5f577afa3ea6ad10923b053f4593e3a4086786501259894d677"}
Apr 16 14:55:28.043350 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.043316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6e023847-8b2f-4ce9-8b72-3047e39b02df","Type":"ContainerStarted","Data":"86424a871ce2baf92f53970d819dabd4e8f6765517ade81694aa2a968205b881"}
Apr 16 14:55:28.068057 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.067458 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.291500925 podStartE2EDuration="5.06744017s" podCreationTimestamp="2026-04-16 14:55:23 +0000 UTC" firstStartedPulling="2026-04-16 14:55:23.512188703 +0000 UTC m=+192.619817808" lastFinishedPulling="2026-04-16 14:55:27.288127929 +0000 UTC m=+196.395757053" observedRunningTime="2026-04-16 14:55:28.065302655 +0000 UTC m=+197.172931814" watchObservedRunningTime="2026-04-16 14:55:28.06744017 +0000 UTC m=+197.175069296"
Apr 16 14:55:28.282910 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.282878 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:55:28.286995 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.286971 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:55:28.289068 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289038 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:55:28.289219 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289087 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:55:28.289219 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289211 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:55:28.289378 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289361 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:55:28.289428 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289381 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:55:28.289428 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289392 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:55:28.289513 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289500 2579 reflector.go:430] "Caches
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:55:28.289564 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289549 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:55:28.289790 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289771 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:55:28.290055 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.289997 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jtt54\"" Apr 16 14:55:28.290137 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.290090 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:55:28.290276 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.290262 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-28euvfec3m3om\"" Apr 16 14:55:28.296670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.296637 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:55:28.302399 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.301548 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:55:28.302830 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.302801 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:28.439189 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439145 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439364 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439364 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439364 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439364 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439314 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439364 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439597 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439367 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439597 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-config\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439597 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m985n\" (UniqueName: \"kubernetes.io/projected/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-kube-api-access-m985n\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439597 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439527 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439597 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439587 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439826 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439826 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439646 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439826 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439669 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439826 ip-10-0-129-105 kubenswrapper[2579]: 
I0416 14:55:28.439734 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-config-out\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439826 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439758 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.439826 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-web-config\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.440094 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.439836 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540728 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m985n\" (UniqueName: \"kubernetes.io/projected/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-kube-api-access-m985n\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540728 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540804 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540831 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-metrics-client-ca\") pod 
\"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-config-out\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-web-config\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.540959 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.540995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541190 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.541386 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.541266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-config\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.542445 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.542106 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.542445 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.542201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.544454 
ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.544426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.545049 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.544926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-config-out\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.545049 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.544943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.545860 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.545837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.546513 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.546413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.546513 
ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.546418 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.546772 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.546566 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.547017 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.546956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-config\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.547138 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.547114 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.548331 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.548273 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.548470 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.548444 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.548678 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.548656 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.549223 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.549179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.549475 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.549428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.550769 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.550749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-web-config\") pod \"prometheus-k8s-0\" 
(UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.551408 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.551388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m985n\" (UniqueName: \"kubernetes.io/projected/25954c20-e9bd-4893-a5f7-ae5ad88a0cbe-kube-api-access-m985n\") pod \"prometheus-k8s-0\" (UID: \"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.605560 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.605514 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:28.784259 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:28.784230 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:28.785669 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:55:28.785638 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25954c20_e9bd_4893_a5f7_ae5ad88a0cbe.slice/crio-4d69d9d7ff2a0a522aeca0f4681a3fbf355457f62c42d7616bf42314cd8adbfb WatchSource:0}: Error finding container 4d69d9d7ff2a0a522aeca0f4681a3fbf355457f62c42d7616bf42314cd8adbfb: Status 404 returned error can't find the container with id 4d69d9d7ff2a0a522aeca0f4681a3fbf355457f62c42d7616bf42314cd8adbfb Apr 16 14:55:29.047888 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.047796 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" event={"ID":"8401793b-8665-481a-9533-86b9379b6202","Type":"ContainerStarted","Data":"b5ded76caf1a208e165a06f69d6c45b7f3821b811bc5bb598b058ed5a4c79fd3"} Apr 16 14:55:29.049228 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.049201 2579 generic.go:358] "Generic (PLEG): container finished" podID="25954c20-e9bd-4893-a5f7-ae5ad88a0cbe" 
containerID="91f91b4ec86a00df9214cca5fc4d32247228f0153215eadd717ef6ea5e68971e" exitCode=0 Apr 16 14:55:29.049315 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.049292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerDied","Data":"91f91b4ec86a00df9214cca5fc4d32247228f0153215eadd717ef6ea5e68971e"} Apr 16 14:55:29.049367 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.049320 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"4d69d9d7ff2a0a522aeca0f4681a3fbf355457f62c42d7616bf42314cd8adbfb"} Apr 16 14:55:29.050759 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.050658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6" event={"ID":"783b5152-8549-4302-a3b6-604516a6e27d","Type":"ContainerStarted","Data":"1ffe3d32ca4241da8045e3103dcae19e91c780ce0da9204eaabe526b720c97ee"} Apr 16 14:55:29.051120 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.051098 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6" Apr 16 14:55:29.055787 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.055764 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6" Apr 16 14:55:29.062607 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.062552 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" podStartSLOduration=1.2591547539999999 podStartE2EDuration="3.062536111s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:26.841417679 +0000 UTC m=+195.949046789" lastFinishedPulling="2026-04-16 
14:55:28.644799039 +0000 UTC m=+197.752428146" observedRunningTime="2026-04-16 14:55:29.061871928 +0000 UTC m=+198.169501054" watchObservedRunningTime="2026-04-16 14:55:29.062536111 +0000 UTC m=+198.170165233" Apr 16 14:55:29.100062 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.100005 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-mcsf6" podStartSLOduration=1.70108863 podStartE2EDuration="3.099988257s" podCreationTimestamp="2026-04-16 14:55:26 +0000 UTC" firstStartedPulling="2026-04-16 14:55:27.247500263 +0000 UTC m=+196.355129364" lastFinishedPulling="2026-04-16 14:55:28.64639988 +0000 UTC m=+197.754028991" observedRunningTime="2026-04-16 14:55:29.098518688 +0000 UTC m=+198.206147815" watchObservedRunningTime="2026-04-16 14:55:29.099988257 +0000 UTC m=+198.207617380" Apr 16 14:55:29.693593 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.693533 2579 patch_prober.go:28] interesting pod/image-registry-5985778f7d-g4nxl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:55:29.693759 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:29.693634 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl" podUID="ea13ce08-5ce8-4080-9afb-976057b2a884" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:55:30.956259 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:30.956223 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5985778f7d-g4nxl" Apr 16 14:55:32.067146 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:32.067106 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"289cbff7d88d9a22180fd539a4c488158e70b0494d527910890482b12f47a922"} Apr 16 14:55:32.067612 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:32.067154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"156b3e89bba0f8caf887c0715e08356ed7bae4b73fef63411058fbb3c752cac3"} Apr 16 14:55:34.076892 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:34.076855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"b216e649b262861d6dd2eeefb186c6e230ac32b16c57aab75406f2d4f880ad7a"} Apr 16 14:55:34.076892 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:34.076893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"9fc9ec830a58de944e5c8bd9d1a97eeff78a10b51b5069d4a97712f01aed2ea5"} Apr 16 14:55:34.077324 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:34.076903 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"39651e9c090d00627c09eaf0afdfad891b2618056a303af4be378e0bace6a8e1"} Apr 16 14:55:34.077324 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:34.076913 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"25954c20-e9bd-4893-a5f7-ae5ad88a0cbe","Type":"ContainerStarted","Data":"66fe1141420371960f3582fcbef116a15a1ac74b1c5d4283e529ee02fb49c59b"} Apr 16 14:55:34.102227 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:34.102171 2579 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.9113415790000001 podStartE2EDuration="6.102150493s" podCreationTimestamp="2026-04-16 14:55:28 +0000 UTC" firstStartedPulling="2026-04-16 14:55:29.0504233 +0000 UTC m=+198.158052401" lastFinishedPulling="2026-04-16 14:55:33.241232213 +0000 UTC m=+202.348861315" observedRunningTime="2026-04-16 14:55:34.099779957 +0000 UTC m=+203.207409082" watchObservedRunningTime="2026-04-16 14:55:34.102150493 +0000 UTC m=+203.209779617" Apr 16 14:55:38.605994 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:38.605942 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.698854 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:46.698811 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" Apr 16 14:55:46.699264 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:55:46.698882 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" Apr 16 14:56:06.703807 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:06.703775 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" Apr 16 14:56:06.707772 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:06.707748 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-74b95c45b8-6gtpp" Apr 16 14:56:07.732645 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:07.732621 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5qg4q_f6ae390c-ede3-458f-8330-0d8d3aad76c2/dns-node-resolver/0.log" Apr 16 14:56:23.342569 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:23.342529 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:56:23.344835 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:23.344811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3db0253-f985-4d95-b46c-abb2acc3e872-metrics-certs\") pod \"network-metrics-daemon-8g7qk\" (UID: \"a3db0253-f985-4d95-b46c-abb2acc3e872\") " pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:56:23.371585 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:23.371549 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fb76v\"" Apr 16 14:56:23.379954 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:23.379926 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8g7qk" Apr 16 14:56:23.501338 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:23.501302 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8g7qk"] Apr 16 14:56:23.504577 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:56:23.504536 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3db0253_f985_4d95_b46c_abb2acc3e872.slice/crio-fd019394c9ce6dc8d729615faf92387ceaaa018606ba6d88738674ab7352afde WatchSource:0}: Error finding container fd019394c9ce6dc8d729615faf92387ceaaa018606ba6d88738674ab7352afde: Status 404 returned error can't find the container with id fd019394c9ce6dc8d729615faf92387ceaaa018606ba6d88738674ab7352afde Apr 16 14:56:24.233670 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:24.233624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8g7qk" event={"ID":"a3db0253-f985-4d95-b46c-abb2acc3e872","Type":"ContainerStarted","Data":"fd019394c9ce6dc8d729615faf92387ceaaa018606ba6d88738674ab7352afde"} Apr 16 14:56:25.238097 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:25.238042 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8g7qk" event={"ID":"a3db0253-f985-4d95-b46c-abb2acc3e872","Type":"ContainerStarted","Data":"521c8e3fd1064ea29fb25197710ddfafd920926d4eb8045a125e0ac3a5bc09ac"} Apr 16 14:56:25.238487 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:25.238107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8g7qk" event={"ID":"a3db0253-f985-4d95-b46c-abb2acc3e872","Type":"ContainerStarted","Data":"7b49dd35c7f5ae350e311b49381da037f10adb56c998b0a74ea1735a41b2b159"} Apr 16 14:56:25.251774 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:25.251722 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-8g7qk" podStartSLOduration=253.327936579 podStartE2EDuration="4m14.251702667s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:56:23.506392982 +0000 UTC m=+252.614022085" lastFinishedPulling="2026-04-16 14:56:24.430159071 +0000 UTC m=+253.537788173" observedRunningTime="2026-04-16 14:56:25.250931701 +0000 UTC m=+254.358560826" watchObservedRunningTime="2026-04-16 14:56:25.251702667 +0000 UTC m=+254.359331792" Apr 16 14:56:28.605908 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:28.605806 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:28.621762 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:28.621734 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:29.267513 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:29.267484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:50.892483 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:56:50.892440 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9zlt8" podUID="be6cb6cd-b928-4807-90d1-c1f8d6657af1" Apr 16 14:56:50.892873 ip-10-0-129-105 kubenswrapper[2579]: E0416 14:56:50.892440 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hg2hp" podUID="7c5aa40b-af79-42ef-99df-394eb1b2d683" Apr 16 14:56:51.316152 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:51.316117 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:56:51.316152 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:51.316137 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9zlt8" Apr 16 14:56:54.214407 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.214355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:56:54.214407 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.214413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:56:54.216809 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.216781 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be6cb6cd-b928-4807-90d1-c1f8d6657af1-metrics-tls\") pod \"dns-default-9zlt8\" (UID: \"be6cb6cd-b928-4807-90d1-c1f8d6657af1\") " pod="openshift-dns/dns-default-9zlt8" Apr 16 14:56:54.216916 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.216868 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c5aa40b-af79-42ef-99df-394eb1b2d683-cert\") pod \"ingress-canary-hg2hp\" (UID: \"7c5aa40b-af79-42ef-99df-394eb1b2d683\") " pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:56:54.318788 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.318754 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cvt8w\"" Apr 16 14:56:54.319248 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.319230 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9hwfz\"" Apr 16 14:56:54.327485 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.327466 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9zlt8" Apr 16 14:56:54.327607 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.327521 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hg2hp" Apr 16 14:56:54.474408 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.474278 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hg2hp"] Apr 16 14:56:54.480180 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:56:54.480030 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5aa40b_af79_42ef_99df_394eb1b2d683.slice/crio-98b6a213e4c66ffe10b47ed830acbb15836cee0b81ef036d67f9fe6358e7be67 WatchSource:0}: Error finding container 98b6a213e4c66ffe10b47ed830acbb15836cee0b81ef036d67f9fe6358e7be67: Status 404 returned error can't find the container with id 98b6a213e4c66ffe10b47ed830acbb15836cee0b81ef036d67f9fe6358e7be67 Apr 16 14:56:54.498449 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:54.498424 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9zlt8"] Apr 16 14:56:54.501221 ip-10-0-129-105 kubenswrapper[2579]: W0416 14:56:54.501194 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe6cb6cd_b928_4807_90d1_c1f8d6657af1.slice/crio-749d04489557a4dc778e7e89d39ad9c51a2d8412e618541ad4df4cbc587c78ab WatchSource:0}: Error finding container 
749d04489557a4dc778e7e89d39ad9c51a2d8412e618541ad4df4cbc587c78ab: Status 404 returned error can't find the container with id 749d04489557a4dc778e7e89d39ad9c51a2d8412e618541ad4df4cbc587c78ab Apr 16 14:56:55.329617 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:55.329580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hg2hp" event={"ID":"7c5aa40b-af79-42ef-99df-394eb1b2d683","Type":"ContainerStarted","Data":"98b6a213e4c66ffe10b47ed830acbb15836cee0b81ef036d67f9fe6358e7be67"} Apr 16 14:56:55.331431 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:55.331397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9zlt8" event={"ID":"be6cb6cd-b928-4807-90d1-c1f8d6657af1","Type":"ContainerStarted","Data":"749d04489557a4dc778e7e89d39ad9c51a2d8412e618541ad4df4cbc587c78ab"} Apr 16 14:56:57.340009 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:57.339971 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hg2hp" event={"ID":"7c5aa40b-af79-42ef-99df-394eb1b2d683","Type":"ContainerStarted","Data":"d3f6abae1de2d3517f1a94078058acf81ca0e7725139226535b9c7260365887a"} Apr 16 14:56:57.341420 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:57.341400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9zlt8" event={"ID":"be6cb6cd-b928-4807-90d1-c1f8d6657af1","Type":"ContainerStarted","Data":"3cc22e35f6dd6f47c750d9267b9fb9214d636e77db8ec55d20bcbb136c4d142c"} Apr 16 14:56:57.341495 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:57.341426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9zlt8" event={"ID":"be6cb6cd-b928-4807-90d1-c1f8d6657af1","Type":"ContainerStarted","Data":"a7f42cc1188d11007164bb7162611987297cced34430a22c64a5e3686a4129da"} Apr 16 14:56:57.341532 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:57.341525 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-dns/dns-default-9zlt8" Apr 16 14:56:57.352905 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:57.352854 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hg2hp" podStartSLOduration=251.551940117 podStartE2EDuration="4m13.352842131s" podCreationTimestamp="2026-04-16 14:52:44 +0000 UTC" firstStartedPulling="2026-04-16 14:56:54.482456422 +0000 UTC m=+283.590085525" lastFinishedPulling="2026-04-16 14:56:56.283358433 +0000 UTC m=+285.390987539" observedRunningTime="2026-04-16 14:56:57.352645189 +0000 UTC m=+286.460274314" watchObservedRunningTime="2026-04-16 14:56:57.352842131 +0000 UTC m=+286.460471255" Apr 16 14:56:57.367170 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:56:57.367123 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9zlt8" podStartSLOduration=251.590333345 podStartE2EDuration="4m13.367107439s" podCreationTimestamp="2026-04-16 14:52:44 +0000 UTC" firstStartedPulling="2026-04-16 14:56:54.502830764 +0000 UTC m=+283.610459865" lastFinishedPulling="2026-04-16 14:56:56.279604842 +0000 UTC m=+285.387233959" observedRunningTime="2026-04-16 14:56:57.36517693 +0000 UTC m=+286.472806055" watchObservedRunningTime="2026-04-16 14:56:57.367107439 +0000 UTC m=+286.474736562" Apr 16 14:57:07.345872 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:57:07.345836 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9zlt8" Apr 16 14:57:11.378567 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:57:11.378534 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 14:57:11.378984 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:57:11.378534 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 14:57:11.382894 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:57:11.382859 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 14:57:11.389209 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:57:11.389173 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 14:57:11.395431 ip-10-0-129-105 kubenswrapper[2579]: I0416 14:57:11.395405 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 15:02:11.407374 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:02:11.407346 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:02:11.408770 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:02:11.408744 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:02:11.410286 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:02:11.410268 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:02:11.411792 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:02:11.411768 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:04:56.990902 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.990869 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-6fqg8"] Apr 16 
15:04:56.993125 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.993110 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6fqg8" Apr 16 15:04:56.995119 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.995096 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 15:04:56.995119 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.995107 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-r6gvl\"" Apr 16 15:04:56.995826 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.995810 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:04:56.995876 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.995813 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:04:56.999991 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:56.999969 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6fqg8"] Apr 16 15:04:57.097005 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:57.096970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dnvr\" (UniqueName: \"kubernetes.io/projected/35c0566b-14ea-463a-966e-6e877ace977d-kube-api-access-8dnvr\") pod \"s3-init-6fqg8\" (UID: \"35c0566b-14ea-463a-966e-6e877ace977d\") " pod="kserve/s3-init-6fqg8" Apr 16 15:04:57.198442 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:57.198408 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dnvr\" (UniqueName: \"kubernetes.io/projected/35c0566b-14ea-463a-966e-6e877ace977d-kube-api-access-8dnvr\") pod \"s3-init-6fqg8\" (UID: \"35c0566b-14ea-463a-966e-6e877ace977d\") " pod="kserve/s3-init-6fqg8" Apr 16 15:04:57.205711 ip-10-0-129-105 kubenswrapper[2579]: 
I0416 15:04:57.205674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dnvr\" (UniqueName: \"kubernetes.io/projected/35c0566b-14ea-463a-966e-6e877ace977d-kube-api-access-8dnvr\") pod \"s3-init-6fqg8\" (UID: \"35c0566b-14ea-463a-966e-6e877ace977d\") " pod="kserve/s3-init-6fqg8" Apr 16 15:04:57.310651 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:57.310552 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6fqg8" Apr 16 15:04:57.430221 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:57.430188 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6fqg8"] Apr 16 15:04:57.433387 ip-10-0-129-105 kubenswrapper[2579]: W0416 15:04:57.433354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35c0566b_14ea_463a_966e_6e877ace977d.slice/crio-e41171587803844a4e6366828e544ba3a053acfaaaa09287991724868a3155b5 WatchSource:0}: Error finding container e41171587803844a4e6366828e544ba3a053acfaaaa09287991724868a3155b5: Status 404 returned error can't find the container with id e41171587803844a4e6366828e544ba3a053acfaaaa09287991724868a3155b5 Apr 16 15:04:57.435251 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:57.435234 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:04:57.755485 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:04:57.755448 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6fqg8" event={"ID":"35c0566b-14ea-463a-966e-6e877ace977d","Type":"ContainerStarted","Data":"e41171587803844a4e6366828e544ba3a053acfaaaa09287991724868a3155b5"} Apr 16 15:05:02.772885 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:02.772845 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6fqg8" 
event={"ID":"35c0566b-14ea-463a-966e-6e877ace977d","Type":"ContainerStarted","Data":"34fbb8e8960d5d253d7454cca879f667f11e22273db9b57fbc8c09e25f9a5b8a"} Apr 16 15:05:02.786810 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:02.786761 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-6fqg8" podStartSLOduration=2.412107021 podStartE2EDuration="6.786744826s" podCreationTimestamp="2026-04-16 15:04:56 +0000 UTC" firstStartedPulling="2026-04-16 15:04:57.435356928 +0000 UTC m=+766.542986030" lastFinishedPulling="2026-04-16 15:05:01.809994723 +0000 UTC m=+770.917623835" observedRunningTime="2026-04-16 15:05:02.784966627 +0000 UTC m=+771.892595750" watchObservedRunningTime="2026-04-16 15:05:02.786744826 +0000 UTC m=+771.894373949" Apr 16 15:05:05.783332 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:05.783297 2579 generic.go:358] "Generic (PLEG): container finished" podID="35c0566b-14ea-463a-966e-6e877ace977d" containerID="34fbb8e8960d5d253d7454cca879f667f11e22273db9b57fbc8c09e25f9a5b8a" exitCode=0 Apr 16 15:05:05.783782 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:05.783355 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6fqg8" event={"ID":"35c0566b-14ea-463a-966e-6e877ace977d","Type":"ContainerDied","Data":"34fbb8e8960d5d253d7454cca879f667f11e22273db9b57fbc8c09e25f9a5b8a"} Apr 16 15:05:06.919626 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:06.919601 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6fqg8" Apr 16 15:05:07.093208 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:07.093116 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dnvr\" (UniqueName: \"kubernetes.io/projected/35c0566b-14ea-463a-966e-6e877ace977d-kube-api-access-8dnvr\") pod \"35c0566b-14ea-463a-966e-6e877ace977d\" (UID: \"35c0566b-14ea-463a-966e-6e877ace977d\") " Apr 16 15:05:07.095374 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:07.095344 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c0566b-14ea-463a-966e-6e877ace977d-kube-api-access-8dnvr" (OuterVolumeSpecName: "kube-api-access-8dnvr") pod "35c0566b-14ea-463a-966e-6e877ace977d" (UID: "35c0566b-14ea-463a-966e-6e877ace977d"). InnerVolumeSpecName "kube-api-access-8dnvr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:05:07.194021 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:07.193986 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dnvr\" (UniqueName: \"kubernetes.io/projected/35c0566b-14ea-463a-966e-6e877ace977d-kube-api-access-8dnvr\") on node \"ip-10-0-129-105.ec2.internal\" DevicePath \"\"" Apr 16 15:05:07.796142 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:07.796047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6fqg8" event={"ID":"35c0566b-14ea-463a-966e-6e877ace977d","Type":"ContainerDied","Data":"e41171587803844a4e6366828e544ba3a053acfaaaa09287991724868a3155b5"} Apr 16 15:05:07.796142 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:07.796101 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41171587803844a4e6366828e544ba3a053acfaaaa09287991724868a3155b5" Apr 16 15:05:07.796142 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:05:07.796105 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6fqg8" Apr 16 15:07:11.432820 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:07:11.432741 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:07:11.433412 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:07:11.433342 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:07:11.442310 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:07:11.442289 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:07:11.443022 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:07:11.443004 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:12:11.461017 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:12:11.460981 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:12:11.463669 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:12:11.463646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:12:11.464015 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:12:11.463995 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:12:11.467131 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:12:11.467115 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:17:11.483599 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:17:11.483567 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:17:11.486842 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:17:11.486816 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:17:11.488399 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:17:11.488368 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:17:11.491125 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:17:11.491108 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:22:11.506316 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:22:11.506285 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:22:11.509888 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:22:11.509854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:22:11.511003 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:22:11.510980 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:22:11.513883 ip-10-0-129-105 
kubenswrapper[2579]: I0416 15:22:11.513863 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:27:11.528687 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:27:11.528657 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:27:11.532049 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:27:11.532011 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:27:11.533665 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:27:11.533646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:27:11.536627 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:27:11.536610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:32:11.551284 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:32:11.551169 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:32:11.555565 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:32:11.554434 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:32:11.556081 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:32:11.556049 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:32:11.558993 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:32:11.558977 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:37:11.573717 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:37:11.573620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:37:11.576479 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:37:11.576448 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:37:11.578315 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:37:11.578294 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:37:11.581313 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:37:11.581297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:42:11.595807 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:42:11.595696 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:42:11.600198 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:42:11.598854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:42:11.600370 ip-10-0-129-105 
kubenswrapper[2579]: I0416 15:42:11.600353 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:42:11.603449 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:42:11.603431 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log" Apr 16 15:44:58.048943 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.048910 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sm4xg/must-gather-gcvsl"] Apr 16 15:44:58.049407 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.049263 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35c0566b-14ea-463a-966e-6e877ace977d" containerName="s3-init" Apr 16 15:44:58.049407 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.049275 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c0566b-14ea-463a-966e-6e877ace977d" containerName="s3-init" Apr 16 15:44:58.049407 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.049333 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="35c0566b-14ea-463a-966e-6e877ace977d" containerName="s3-init" Apr 16 15:44:58.052447 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.052431 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.054289 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.054270 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sm4xg\"/\"kube-root-ca.crt\"" Apr 16 15:44:58.054702 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.054687 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sm4xg\"/\"openshift-service-ca.crt\"" Apr 16 15:44:58.054741 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.054703 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sm4xg\"/\"default-dockercfg-6dm8x\"" Apr 16 15:44:58.063846 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.063822 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sm4xg/must-gather-gcvsl"] Apr 16 15:44:58.104490 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.104454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75318b36-d773-44e1-b710-4e270d866ee6-must-gather-output\") pod \"must-gather-gcvsl\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.104670 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.104536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmg9q\" (UniqueName: \"kubernetes.io/projected/75318b36-d773-44e1-b710-4e270d866ee6-kube-api-access-bmg9q\") pod \"must-gather-gcvsl\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.205229 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.205192 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/75318b36-d773-44e1-b710-4e270d866ee6-must-gather-output\") pod \"must-gather-gcvsl\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.205394 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.205256 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmg9q\" (UniqueName: \"kubernetes.io/projected/75318b36-d773-44e1-b710-4e270d866ee6-kube-api-access-bmg9q\") pod \"must-gather-gcvsl\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.205613 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.205587 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75318b36-d773-44e1-b710-4e270d866ee6-must-gather-output\") pod \"must-gather-gcvsl\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.212191 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.212166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmg9q\" (UniqueName: \"kubernetes.io/projected/75318b36-d773-44e1-b710-4e270d866ee6-kube-api-access-bmg9q\") pod \"must-gather-gcvsl\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.377131 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.377007 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:44:58.499014 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.498816 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sm4xg/must-gather-gcvsl"] Apr 16 15:44:58.501743 ip-10-0-129-105 kubenswrapper[2579]: W0416 15:44:58.501707 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75318b36_d773_44e1_b710_4e270d866ee6.slice/crio-eebd7b5f35986da09cf89847e047073b3b8f366a8a52d9d853bd7554f1a07f4a WatchSource:0}: Error finding container eebd7b5f35986da09cf89847e047073b3b8f366a8a52d9d853bd7554f1a07f4a: Status 404 returned error can't find the container with id eebd7b5f35986da09cf89847e047073b3b8f366a8a52d9d853bd7554f1a07f4a Apr 16 15:44:58.503424 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.503406 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:44:58.884160 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:44:58.884115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" event={"ID":"75318b36-d773-44e1-b710-4e270d866ee6","Type":"ContainerStarted","Data":"eebd7b5f35986da09cf89847e047073b3b8f366a8a52d9d853bd7554f1a07f4a"} Apr 16 15:45:03.903199 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:03.903156 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" event={"ID":"75318b36-d773-44e1-b710-4e270d866ee6","Type":"ContainerStarted","Data":"8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c"} Apr 16 15:45:03.903199 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:03.903203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" 
event={"ID":"75318b36-d773-44e1-b710-4e270d866ee6","Type":"ContainerStarted","Data":"30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0"} Apr 16 15:45:03.917240 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:03.917185 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" podStartSLOduration=1.553074027 podStartE2EDuration="5.917166029s" podCreationTimestamp="2026-04-16 15:44:58 +0000 UTC" firstStartedPulling="2026-04-16 15:44:58.503559505 +0000 UTC m=+3167.611188606" lastFinishedPulling="2026-04-16 15:45:02.867651504 +0000 UTC m=+3171.975280608" observedRunningTime="2026-04-16 15:45:03.915263295 +0000 UTC m=+3173.022892418" watchObservedRunningTime="2026-04-16 15:45:03.917166029 +0000 UTC m=+3173.024795153" Apr 16 15:45:21.963065 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:21.963029 2579 generic.go:358] "Generic (PLEG): container finished" podID="75318b36-d773-44e1-b710-4e270d866ee6" containerID="30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0" exitCode=0 Apr 16 15:45:21.963493 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:21.963104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" event={"ID":"75318b36-d773-44e1-b710-4e270d866ee6","Type":"ContainerDied","Data":"30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0"} Apr 16 15:45:21.963493 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:21.963465 2579 scope.go:117] "RemoveContainer" containerID="30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0" Apr 16 15:45:22.198024 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:22.197987 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sm4xg_must-gather-gcvsl_75318b36-d773-44e1-b710-4e270d866ee6/gather/0.log" Apr 16 15:45:25.637356 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:25.637314 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-s8pz4_eb304f66-fefa-4772-b282-3ce9a4298910/global-pull-secret-syncer/0.log" Apr 16 15:45:25.781488 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:25.781457 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nxthn_2b4700c8-468e-4f4d-9f39-760f7db3a824/konnectivity-agent/0.log" Apr 16 15:45:25.801793 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:25.801756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-105.ec2.internal_c7b6d0dbc6e31e2e27484e63c6997a81/haproxy/0.log" Apr 16 15:45:27.615617 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.615583 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sm4xg/must-gather-gcvsl"] Apr 16 15:45:27.616130 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.615820 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="copy" containerID="cri-o://8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c" gracePeriod=2 Apr 16 15:45:27.620638 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.620613 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sm4xg/must-gather-gcvsl"] Apr 16 15:45:27.841745 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.841716 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sm4xg_must-gather-gcvsl_75318b36-d773-44e1-b710-4e270d866ee6/copy/0.log" Apr 16 15:45:27.842046 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.842032 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:45:27.843523 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.843501 2579 status_manager.go:895] "Failed to get status for pod" podUID="75318b36-d773-44e1-b710-4e270d866ee6" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" err="pods \"must-gather-gcvsl\" is forbidden: User \"system:node:ip-10-0-129-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sm4xg\": no relationship found between node 'ip-10-0-129-105.ec2.internal' and this object" Apr 16 15:45:27.862584 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.862557 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmg9q\" (UniqueName: \"kubernetes.io/projected/75318b36-d773-44e1-b710-4e270d866ee6-kube-api-access-bmg9q\") pod \"75318b36-d773-44e1-b710-4e270d866ee6\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " Apr 16 15:45:27.862681 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.862604 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75318b36-d773-44e1-b710-4e270d866ee6-must-gather-output\") pod \"75318b36-d773-44e1-b710-4e270d866ee6\" (UID: \"75318b36-d773-44e1-b710-4e270d866ee6\") " Apr 16 15:45:27.863791 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.863768 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75318b36-d773-44e1-b710-4e270d866ee6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "75318b36-d773-44e1-b710-4e270d866ee6" (UID: "75318b36-d773-44e1-b710-4e270d866ee6"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:45:27.864726 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.864703 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75318b36-d773-44e1-b710-4e270d866ee6-kube-api-access-bmg9q" (OuterVolumeSpecName: "kube-api-access-bmg9q") pod "75318b36-d773-44e1-b710-4e270d866ee6" (UID: "75318b36-d773-44e1-b710-4e270d866ee6"). InnerVolumeSpecName "kube-api-access-bmg9q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:45:27.963233 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.963194 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/75318b36-d773-44e1-b710-4e270d866ee6-must-gather-output\") on node \"ip-10-0-129-105.ec2.internal\" DevicePath \"\"" Apr 16 15:45:27.963233 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.963229 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bmg9q\" (UniqueName: \"kubernetes.io/projected/75318b36-d773-44e1-b710-4e270d866ee6-kube-api-access-bmg9q\") on node \"ip-10-0-129-105.ec2.internal\" DevicePath \"\"" Apr 16 15:45:27.981516 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.981487 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sm4xg_must-gather-gcvsl_75318b36-d773-44e1-b710-4e270d866ee6/copy/0.log" Apr 16 15:45:27.981793 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.981768 2579 generic.go:358] "Generic (PLEG): container finished" podID="75318b36-d773-44e1-b710-4e270d866ee6" containerID="8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c" exitCode=143 Apr 16 15:45:27.981848 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.981823 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" Apr 16 15:45:27.981893 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.981878 2579 scope.go:117] "RemoveContainer" containerID="8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c" Apr 16 15:45:27.983532 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.983510 2579 status_manager.go:895] "Failed to get status for pod" podUID="75318b36-d773-44e1-b710-4e270d866ee6" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" err="pods \"must-gather-gcvsl\" is forbidden: User \"system:node:ip-10-0-129-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sm4xg\": no relationship found between node 'ip-10-0-129-105.ec2.internal' and this object" Apr 16 15:45:27.990232 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.990208 2579 scope.go:117] "RemoveContainer" containerID="30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0" Apr 16 15:45:27.991203 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:27.991181 2579 status_manager.go:895] "Failed to get status for pod" podUID="75318b36-d773-44e1-b710-4e270d866ee6" pod="openshift-must-gather-sm4xg/must-gather-gcvsl" err="pods \"must-gather-gcvsl\" is forbidden: User \"system:node:ip-10-0-129-105.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sm4xg\": no relationship found between node 'ip-10-0-129-105.ec2.internal' and this object" Apr 16 15:45:28.002267 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:28.002246 2579 scope.go:117] "RemoveContainer" containerID="8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c" Apr 16 15:45:28.002532 ip-10-0-129-105 kubenswrapper[2579]: E0416 15:45:28.002511 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c\": container with ID 
starting with 8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c not found: ID does not exist" containerID="8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c" Apr 16 15:45:28.002593 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:28.002541 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c"} err="failed to get container status \"8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c\": rpc error: code = NotFound desc = could not find container \"8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c\": container with ID starting with 8ee9dd7ffedcdaa815dceff7225d0188cec423c21414d746b060e700efab388c not found: ID does not exist" Apr 16 15:45:28.002593 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:28.002572 2579 scope.go:117] "RemoveContainer" containerID="30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0" Apr 16 15:45:28.002829 ip-10-0-129-105 kubenswrapper[2579]: E0416 15:45:28.002810 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0\": container with ID starting with 30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0 not found: ID does not exist" containerID="30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0" Apr 16 15:45:28.002877 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:28.002834 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0"} err="failed to get container status \"30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0\": rpc error: code = NotFound desc = could not find container \"30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0\": container with ID starting with 
30f6ae28b34501a5f98396fe3c2bb93aa897175db777df74212c47ffd3fe48f0 not found: ID does not exist" Apr 16 15:45:29.038927 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.038863 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/alertmanager/0.log" Apr 16 15:45:29.061337 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.061312 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/config-reloader/0.log" Apr 16 15:45:29.085061 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.085037 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/kube-rbac-proxy-web/0.log" Apr 16 15:45:29.110903 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.110867 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/kube-rbac-proxy/0.log" Apr 16 15:45:29.130743 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.130717 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/kube-rbac-proxy-metric/0.log" Apr 16 15:45:29.154478 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.154454 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/prom-label-proxy/0.log" Apr 16 15:45:29.177743 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.177722 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6e023847-8b2f-4ce9-8b72-3047e39b02df/init-config-reloader/0.log" Apr 16 15:45:29.259555 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.259524 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-cvmbz_c04e1a6b-9b30-411d-9334-189a455233d6/kube-state-metrics/0.log" Apr 16 15:45:29.282383 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.282356 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-cvmbz_c04e1a6b-9b30-411d-9334-189a455233d6/kube-rbac-proxy-main/0.log" Apr 16 15:45:29.304036 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.303955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-cvmbz_c04e1a6b-9b30-411d-9334-189a455233d6/kube-rbac-proxy-self/0.log" Apr 16 15:45:29.332431 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.332403 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-74b95c45b8-6gtpp_8401793b-8665-481a-9533-86b9379b6202/metrics-server/0.log" Apr 16 15:45:29.357050 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.357021 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-mcsf6_783b5152-8549-4302-a3b6-604516a6e27d/monitoring-plugin/0.log" Apr 16 15:45:29.387624 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.387598 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69bnh_48808a06-57df-42da-95c5-7989c2599f55/node-exporter/0.log" Apr 16 15:45:29.409029 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.409002 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69bnh_48808a06-57df-42da-95c5-7989c2599f55/kube-rbac-proxy/0.log" Apr 16 15:45:29.444492 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.444465 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-69bnh_48808a06-57df-42da-95c5-7989c2599f55/init-textfile/0.log" Apr 16 15:45:29.472158 ip-10-0-129-105 kubenswrapper[2579]: 
I0416 15:45:29.472125 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75318b36-d773-44e1-b710-4e270d866ee6" path="/var/lib/kubelet/pods/75318b36-d773-44e1-b710-4e270d866ee6/volumes" Apr 16 15:45:29.612459 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.612372 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4drxr_23b0fb83-9383-4487-9c59-ea958ea92af3/kube-rbac-proxy-main/0.log" Apr 16 15:45:29.635519 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.635491 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4drxr_23b0fb83-9383-4487-9c59-ea958ea92af3/kube-rbac-proxy-self/0.log" Apr 16 15:45:29.656876 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.656844 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4drxr_23b0fb83-9383-4487-9c59-ea958ea92af3/openshift-state-metrics/0.log" Apr 16 15:45:29.695523 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.695495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/prometheus/0.log" Apr 16 15:45:29.713019 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.712996 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/config-reloader/0.log" Apr 16 15:45:29.733472 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.733426 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/thanos-sidecar/0.log" Apr 16 15:45:29.753800 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.753763 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/kube-rbac-proxy-web/0.log" Apr 16 15:45:29.775233 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.775207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/kube-rbac-proxy/0.log" Apr 16 15:45:29.795933 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.795905 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/kube-rbac-proxy-thanos/0.log" Apr 16 15:45:29.816993 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.816966 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_25954c20-e9bd-4893-a5f7-ae5ad88a0cbe/init-config-reloader/0.log" Apr 16 15:45:29.891437 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:29.891412 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-7lxqz_ecc581b0-e89d-4a7f-948b-429c743874bc/prometheus-operator-admission-webhook/0.log" Apr 16 15:45:31.615629 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:31.615590 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/1.log" Apr 16 15:45:31.620282 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:31.620253 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2lhl8_6dbffa58-0a86-4116-9fd8-0dca9f45e365/console-operator/2.log" Apr 16 15:45:32.277398 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277361 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"] Apr 16 15:45:32.277691 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277678 2579 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="gather" Apr 16 15:45:32.277748 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277693 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="gather" Apr 16 15:45:32.277748 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277719 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="copy" Apr 16 15:45:32.277748 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277724 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="copy" Apr 16 15:45:32.277859 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277770 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="copy" Apr 16 15:45:32.277859 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.277779 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="75318b36-d773-44e1-b710-4e270d866ee6" containerName="gather" Apr 16 15:45:32.283046 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.283020 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.285034 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.285008 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ndkd\"/\"kube-root-ca.crt\"" Apr 16 15:45:32.285492 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.285470 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2ndkd\"/\"default-dockercfg-jmzsl\"" Apr 16 15:45:32.285574 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.285470 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ndkd\"/\"openshift-service-ca.crt\"" Apr 16 15:45:32.290405 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.290368 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"] Apr 16 15:45:32.297762 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.297730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-sys\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.297762 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.297764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-proc\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.297950 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.297796 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-lib-modules\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.297950 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.297902 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-podres\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.298016 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.297976 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6tj\" (UniqueName: \"kubernetes.io/projected/07c90554-6ed3-40bd-950c-80dfd2eeb118-kube-api-access-cc6tj\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.399384 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-lib-modules\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" Apr 16 15:45:32.399560 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-podres\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " 
pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399560 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6tj\" (UniqueName: \"kubernetes.io/projected/07c90554-6ed3-40bd-950c-80dfd2eeb118-kube-api-access-cc6tj\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399560 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-sys\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399560 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-proc\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399691 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-lib-modules\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399691 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399569 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-sys\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399691 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-proc\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.399691 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.399605 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/07c90554-6ed3-40bd-950c-80dfd2eeb118-podres\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.406819 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.406783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6tj\" (UniqueName: \"kubernetes.io/projected/07c90554-6ed3-40bd-950c-80dfd2eeb118-kube-api-access-cc6tj\") pod \"perf-node-gather-daemonset-7n7x5\" (UID: \"07c90554-6ed3-40bd-950c-80dfd2eeb118\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.595212 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.595107 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:32.729940 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.729900 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"]
Apr 16 15:45:32.734486 ip-10-0-129-105 kubenswrapper[2579]: W0416 15:45:32.734452 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod07c90554_6ed3_40bd_950c_80dfd2eeb118.slice/crio-ad901db5a5b89e7bfd8ff6f90709baacf5e5999a1476ee396f237cbde35e2286 WatchSource:0}: Error finding container ad901db5a5b89e7bfd8ff6f90709baacf5e5999a1476ee396f237cbde35e2286: Status 404 returned error can't find the container with id ad901db5a5b89e7bfd8ff6f90709baacf5e5999a1476ee396f237cbde35e2286
Apr 16 15:45:32.999620 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.999584 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" event={"ID":"07c90554-6ed3-40bd-950c-80dfd2eeb118","Type":"ContainerStarted","Data":"b231e359e889a08dd36d0a87aac2b988ed383e49e758fe6818615d413b9d3159"}
Apr 16 15:45:32.999620 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.999620 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" event={"ID":"07c90554-6ed3-40bd-950c-80dfd2eeb118","Type":"ContainerStarted","Data":"ad901db5a5b89e7bfd8ff6f90709baacf5e5999a1476ee396f237cbde35e2286"}
Apr 16 15:45:32.999823 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:32.999663 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:33.000904 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:33.000887 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9zlt8_be6cb6cd-b928-4807-90d1-c1f8d6657af1/dns/0.log"
Apr 16 15:45:33.013671 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:33.013621 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5" podStartSLOduration=1.013605928 podStartE2EDuration="1.013605928s" podCreationTimestamp="2026-04-16 15:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:45:33.012732122 +0000 UTC m=+3202.120361247" watchObservedRunningTime="2026-04-16 15:45:33.013605928 +0000 UTC m=+3202.121235052"
Apr 16 15:45:33.024864 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:33.024836 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9zlt8_be6cb6cd-b928-4807-90d1-c1f8d6657af1/kube-rbac-proxy/0.log"
Apr 16 15:45:33.170007 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:33.169974 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5qg4q_f6ae390c-ede3-458f-8330-0d8d3aad76c2/dns-node-resolver/0.log"
Apr 16 15:45:33.566827 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:33.566791 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5985778f7d-g4nxl_ea13ce08-5ce8-4080-9afb-976057b2a884/registry/0.log"
Apr 16 15:45:33.590349 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:33.590325 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4d625_c778a259-410c-444b-a486-c230dd795def/node-ca/0.log"
Apr 16 15:45:34.258677 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:34.258648 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7468b496d-5qb6b_99e316cc-57e8-4d70-bb6b-e957b5bccf87/router/0.log"
Apr 16 15:45:34.575653 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:34.575571 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hg2hp_7c5aa40b-af79-42ef-99df-394eb1b2d683/serve-healthcheck-canary/0.log"
Apr 16 15:45:34.938517 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:34.938490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hxs84_860d2f9f-664f-4720-9840-cd7e447f9aa3/kube-rbac-proxy/0.log"
Apr 16 15:45:34.958421 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:34.958390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hxs84_860d2f9f-664f-4720-9840-cd7e447f9aa3/exporter/0.log"
Apr 16 15:45:34.979109 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:34.979058 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hxs84_860d2f9f-664f-4720-9840-cd7e447f9aa3/extractor/0.log"
Apr 16 15:45:37.201119 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:37.201090 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-6fqg8_35c0566b-14ea-463a-966e-6e877ace977d/s3-init/0.log"
Apr 16 15:45:39.012809 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:39.012773 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-7n7x5"
Apr 16 15:45:41.996899 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:41.996865 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-47k57_ca6006fe-c049-4f20-b847-d14270e6af58/kube-multus/0.log"
Apr 16 15:45:42.198526 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.198500 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/kube-multus-additional-cni-plugins/0.log"
Apr 16 15:45:42.219038 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.219012 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/egress-router-binary-copy/0.log"
Apr 16 15:45:42.242619 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.242592 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/cni-plugins/0.log"
Apr 16 15:45:42.263577 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.263542 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/bond-cni-plugin/0.log"
Apr 16 15:45:42.284158 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.284132 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/routeoverride-cni/0.log"
Apr 16 15:45:42.304329 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.304296 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/whereabouts-cni-bincopy/0.log"
Apr 16 15:45:42.324402 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.324370 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hkq7w_cc0a6a72-089b-44bd-97ca-a4963264f458/whereabouts-cni/0.log"
Apr 16 15:45:42.600658 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.600570 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8g7qk_a3db0253-f985-4d95-b46c-abb2acc3e872/network-metrics-daemon/0.log"
Apr 16 15:45:42.620323 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:42.620294 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8g7qk_a3db0253-f985-4d95-b46c-abb2acc3e872/kube-rbac-proxy/0.log"
Apr 16 15:45:43.701019 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.700982 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-controller/0.log"
Apr 16 15:45:43.720733 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.720701 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/0.log"
Apr 16 15:45:43.735634 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.735607 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovn-acl-logging/1.log"
Apr 16 15:45:43.753952 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.753923 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/kube-rbac-proxy-node/0.log"
Apr 16 15:45:43.776126 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.776098 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 15:45:43.794188 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.794160 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/northd/0.log"
Apr 16 15:45:43.816426 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.816400 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/nbdb/0.log"
Apr 16 15:45:43.837968 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.837944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/sbdb/0.log"
Apr 16 15:45:43.938645 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:43.938611 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m882k_539ba0b2-e94b-4e6d-9955-d2325acb7a00/ovnkube-controller/0.log"
Apr 16 15:45:45.180769 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:45.180738 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6kn6f_274057c1-8751-4b12-8464-7a42a2c6372c/network-check-target-container/0.log"
Apr 16 15:45:46.041841 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:46.041807 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-cl4w9_9e9ac884-30e2-4486-9e89-d541e73ee8c4/iptables-alerter/0.log"
Apr 16 15:45:46.630067 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:46.630035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jpzc4_4c666548-852c-40b9-aa2a-ee177bbfb811/tuned/0.log"
Apr 16 15:45:48.214086 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:48.214038 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-ps85k_0848e866-88d6-48a6-abea-931262f45c54/cluster-samples-operator/0.log"
Apr 16 15:45:48.230095 ip-10-0-129-105 kubenswrapper[2579]: I0416 15:45:48.230044 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-ps85k_0848e866-88d6-48a6-abea-931262f45c54/cluster-samples-operator-watch/0.log"