Apr 16 20:11:18.972814 ip-10-0-129-41 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:19.422473 ip-10-0-129-41 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:19.422473 ip-10-0-129-41 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:19.422473 ip-10-0-129-41 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:19.422473 ip-10-0-129-41 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:19.422473 ip-10-0-129-41 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
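The deprecation warnings above all point at the same remedy: move these settings into the file passed to the kubelet's --config flag. A minimal sketch of the KubeletConfiguration equivalents is shown below; the field names are the real v1beta1 API fields, but every value here is an illustrative assumption, not taken from this node (this log does not show the node's actual config). Note that --pod-infra-container-image has no config-file equivalent; per the message above, the sandbox image now comes from the CRI runtime.

```yaml
# Hypothetical config-file replacements for the deprecated flags logged above.
# Values are placeholders for illustration only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:            # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:              # --minimum-container-ttl-duration is superseded by eviction settings
  memory.available: 100Mi
```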
Apr 16 20:11:19.423292 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.423152 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432563 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432584 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432588 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432591 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432594 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432598 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432601 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432604 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432607 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:19.432595 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432610 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432613 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432615 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432618 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432621 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432624 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432626 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432629 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432632 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432634 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432637 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432639 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432642 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432644 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432647 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432649 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432652 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432657 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432660 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:19.432942 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432663 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432665 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432668 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432670 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432673 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432676 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432678 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432683 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432687 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432690 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432692 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432695 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432698 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432701 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432703 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432707 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432710 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432712 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432715 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432718 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:19.433396 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432720 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432724 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432728 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432731 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432733 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432736 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432738 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432741 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432743 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432746 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432749 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432752 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432754 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432757 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432760 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432763 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432766 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432768 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432771 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:19.433914 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432773 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432776 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432778 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432782 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432784 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432787 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432790 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432792 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432795 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432798 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432800 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432804 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432807 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432809 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432811 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432814 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432816 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432819 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.432821 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433222 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:19.434384 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433227 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433230 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433233 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433236 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433239 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433242 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433245 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433248 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433251 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433253 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433256 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433258 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433261 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433264 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433266 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433269 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433271 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433274 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433276 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433279 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:19.434890 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433283 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433285 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433288 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433291 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433294 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433314 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433318 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433321 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433323 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433326 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433329 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433333 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433336 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433338 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433341 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433344 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433347 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433349 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433352 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:19.435379 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433354 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433357 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433360 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433362 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433365 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433367 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433370 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433372 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433375 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433377 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433380 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433382 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433385 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433387 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433390 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433392 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433395 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433398 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433401 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433403 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:19.435888 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433406 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433409 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433411 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433414 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433416 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433418 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433421 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433423 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433426 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433429 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433431 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433434 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433436 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433441 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433445 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433448 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433450 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433454 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:19.436366 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433457 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433460 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433463 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433466 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433468 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433471 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433473 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.433476 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433570 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433581 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433604 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433610 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433615 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433618 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433623 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433627 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433631 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433634 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433638 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433641 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433644 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433647 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433650 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:19.436847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433653 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433656 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433659 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433662 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433666 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433669 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433672 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433675 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433678 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433682 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433685 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433689 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433692 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433695 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433698 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433701 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433705 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433708 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433713 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433715 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433718 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433721 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433725 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433728 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433733 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:19.437412 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433736 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433739 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433742 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433745 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433748 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433751 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433758 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433762 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433765 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433767 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433771 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433773 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433776 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433779 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433782 2571 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433785 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433788 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433791 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433795 2571 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433797 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433800 2571 flags.go:64] FLAG: --help="false" Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433803 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-129-41.ec2.internal" Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433806 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433809 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:11:19.438070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433812 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433815 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433819 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433822 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433824 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433827 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433830 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433834 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:11:19.438672 ip-10-0-129-41 
kubenswrapper[2571]: I0416 20:11:19.433837 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433840 2571 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433842 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433845 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433848 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433851 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433855 2571 flags.go:64] FLAG: --lock-file="" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433858 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433861 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433864 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433869 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433873 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433876 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433878 2571 flags.go:64] FLAG: --logging-format="text" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433881 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 
20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433885 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:11:19.438672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433887 2571 flags.go:64] FLAG: --manifest-url="" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433890 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433895 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433898 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433904 2571 flags.go:64] FLAG: --max-pods="110" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433907 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433910 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433914 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433917 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433920 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433923 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433926 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433934 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 
20:11:19.433937 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433940 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433944 2571 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433946 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433953 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433957 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433959 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433962 2571 flags.go:64] FLAG: --port="10250" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433965 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433968 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0dd7e90c617e62177" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433972 2571 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:11:19.439245 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433975 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433978 2571 flags.go:64] FLAG: --register-node="true" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433981 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433984 2571 flags.go:64] FLAG: 
--register-with-taints="" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433988 2571 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433990 2571 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433993 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.433996 2571 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434000 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434003 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434005 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434008 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434011 2571 flags.go:64] FLAG: --runonce="false" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434014 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434016 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434024 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434026 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434039 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434043 2571 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434046 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434049 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434052 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434055 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434058 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434062 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434065 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:11:19.439967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434068 2571 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434070 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434077 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434079 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434082 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434088 2571 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434091 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:11:19.440618 ip-10-0-129-41 
kubenswrapper[2571]: I0416 20:11:19.434094 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434097 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434100 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434103 2571 flags.go:64] FLAG: --v="2" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434107 2571 flags.go:64] FLAG: --version="false" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434112 2571 flags.go:64] FLAG: --vmodule="" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434116 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.434119 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434219 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434223 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434226 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434229 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434232 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434234 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:19.440618 ip-10-0-129-41 
kubenswrapper[2571]: W0416 20:11:19.434238 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434241 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:19.440618 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434244 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434246 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434249 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434252 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434254 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434257 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434259 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434262 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434264 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434267 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434270 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:19.441188 
ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434272 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434275 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434278 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434283 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434285 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434288 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434290 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434293 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434295 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:19.441188 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434298 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434300 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434303 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434305 2571 feature_gate.go:328] 
unrecognized feature gate: BootcNodeManagement Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434308 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434310 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434313 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434316 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434318 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434320 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434324 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434327 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434329 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434332 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434334 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434336 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434339 2571 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434341 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434344 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434346 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:19.441718 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434349 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434353 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434355 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434358 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434360 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434363 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434366 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434369 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434372 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:19.442217 ip-10-0-129-41 
kubenswrapper[2571]: W0416 20:11:19.434374 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434377 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434379 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434382 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434384 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434387 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434389 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434392 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434395 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434397 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:19.442217 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434400 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434404 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434407 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434411 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434414 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434416 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434419 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434421 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434424 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434426 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434428 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434431 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434434 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434436 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: 
W0416 20:11:19.434439 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434442 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434445 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434447 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:19.442726 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.434449 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.435316 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.441935 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.441954 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442003 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442009 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 
20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442012 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442016 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442019 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442022 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442025 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442028 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442039 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442042 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442045 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442048 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:19.443204 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442050 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442053 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442055 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:19.443640 
ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442058 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442061 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442064 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442068 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442073 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442076 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442079 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442082 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442085 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442087 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442090 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442093 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 
20:11:19.442096 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442099 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442101 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442104 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:19.443640 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442109 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442113 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442115 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442118 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442121 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442123 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442126 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442128 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442132 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 
20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442134 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442136 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442139 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442141 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442144 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442146 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442148 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442151 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442153 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442156 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442158 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:19.444110 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442161 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442163 2571 feature_gate.go:328] 
unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442165 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442168 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442171 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442173 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442175 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442178 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442180 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442183 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442186 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442189 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442193 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442197 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442199 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442202 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442204 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442207 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442209 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:19.444609 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442212 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442214 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442217 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442220 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442222 2571 feature_gate.go:328] 
unrecognized feature gate: NetworkSegmentation Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442225 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442229 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442231 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442234 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442236 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442239 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442242 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442244 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442247 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442249 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442252 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:19.445070 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.442257 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442359 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442365 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442368 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442371 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442374 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442377 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442380 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442382 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442385 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442389 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:11:19.445472 
ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442392 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442395 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442398 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442401 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442403 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442406 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442408 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442411 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:11:19.445472 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442413 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442416 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442419 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442421 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 
20:11:19.442424 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442426 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442430 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442433 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442436 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442439 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442441 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442444 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442446 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442449 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442451 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442453 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442456 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442458 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442460 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:11:19.445955 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442463 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442466 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442468 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442471 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442473 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442476 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442479 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442481 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442484 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442487 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442489 2571 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442491 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442494 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442496 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442499 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442501 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442504 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442506 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442508 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442511 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:11:19.446425 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442513 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442515 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442518 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: 
W0416 20:11:19.442521 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442523 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442544 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442547 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442549 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442552 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442554 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442557 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442559 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442561 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442564 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442566 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442569 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:11:19.446925 
ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442571 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442573 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442576 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442579 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:11:19.446925 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442582 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442584 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442587 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442590 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442593 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442596 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442598 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442601 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:19.442603 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.442608 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.443490 2571 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.446403 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 20:11:19.447404 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.447358 2571 server.go:1019] "Starting client certificate rotation" Apr 16 20:11:19.447740 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.447458 2571 
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:19.447740 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.447496 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:11:19.474170 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.474145 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:19.477064 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.476995 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:11:19.494482 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.494452 2571 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:11:19.500291 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.500268 2571 log.go:25] "Validated CRI v1 image API" Apr 16 20:11:19.501464 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.501439 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:11:19.504631 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.504614 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:19.505650 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.505626 2571 fs.go:135] Filesystem UUIDs: map[62f03c6e-dc70-4e58-8553-2422e01da74d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c5fd7282-c688-4c81-bd71-085d40d7d5ff:/dev/nvme0n1p4] Apr 16 20:11:19.505698 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.505650 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 
fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:11:19.512088 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.511967 2571 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:19.509443249 +0000 UTC m=+0.416788659 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099625 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec205736af4748d95cc1ef8851e10613 SystemUUID:ec205736-af47-48d9-5cc1-ef8851e10613 BootID:8f309758-fc66-42ff-9b49-91a0b6d6fb0d Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1c:4d:42:eb:75 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1c:4d:42:eb:75 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:e0:33:fc:81:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction 
Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:11:19.512088 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.512083 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 20:11:19.512191 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.512169 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:11:19.513260 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.513239 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:11:19.513423 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.513262 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-41.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:19.513471 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.513434 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:19.513471 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.513443 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:19.513471 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.513456 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:19.514572 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.514561 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:19.515314 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.515304 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:19.515421 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.515412 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:11:19.518177 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.518166 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:11:19.518220 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.518185 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:11:19.518220 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.518198 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:11:19.518220 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.518209 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:11:19.518220 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.518218 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:11:19.519281 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.519269 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:19.519331 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.519288 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:19.522188 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.522175 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:11:19.523405 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.523391 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:11:19.525019 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525003 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525031 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525040 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525047 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525052 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525059 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525066 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525072 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:11:19.525079 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525080 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:11:19.525280 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525087 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:11:19.525280 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525096 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:11:19.525280 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.525105 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:11:19.526099 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.526089 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:11:19.526133 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.526100 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:11:19.531276 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.531257 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:11:19.531398 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.531311 2571 server.go:1295] "Started kubelet"
Apr 16 20:11:19.531482 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.531429 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:11:19.531538 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.531468 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:11:19.532133 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.531505 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:11:19.532307 ip-10-0-129-41 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:11:19.533757 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.533737 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:11:19.534142 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.534130 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:11:19.535813 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.535793 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-41.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 20:11:19.535917 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.535840 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 20:11:19.535917 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.535841 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-41.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 20:11:19.538494 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.538470 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:19.538981 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.538966 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:11:19.539701 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.539681 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:11:19.539779 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.539705 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:11:19.539851 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.539681 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:11:19.540078 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.539932 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:11:19.540078 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.539940 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:11:19.540078 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.539999 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:19.540283 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.540262 2571 factory.go:55] Registering systemd factory
Apr 16 20:11:19.540283 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.540287 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:11:19.540903 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.540887 2571 factory.go:153] Registering CRI-O factory
Apr 16 20:11:19.541010 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.540996 2571 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:11:19.541138 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.541128 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:11:19.541234 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.541224 2571 factory.go:103] Registering Raw factory
Apr 16 20:11:19.541286 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.541258 2571 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:11:19.541713 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.541695 2571 manager.go:319] Starting recovery of all containers
Apr 16 20:11:19.542018 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.541972 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:11:19.547333 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.547134 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-41.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 20:11:19.547424 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.547406 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 20:11:19.548675 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.547559 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-41.ec2.internal.18a6ef5b8d05dca3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-41.ec2.internal,UID:ip-10-0-129-41.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-41.ec2.internal,},FirstTimestamp:2026-04-16 20:11:19.531273379 +0000 UTC m=+0.438618790,LastTimestamp:2026-04-16 20:11:19.531273379 +0000 UTC m=+0.438618790,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-41.ec2.internal,}"
Apr 16 20:11:19.550952 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.550921 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nm6p5"
Apr 16 20:11:19.552179 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.552152 2571 manager.go:324] Recovery completed
Apr 16 20:11:19.554770 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.554728 2571 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 20:11:19.557791 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.557779 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:19.558612 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.558595 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nm6p5"
Apr 16 20:11:19.560256 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.560236 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:19.560333 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.560272 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:19.560333 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.560285 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:19.560833 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.560821 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:11:19.560833 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.560834 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:11:19.560927 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.560848 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:19.562520 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.562446 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-41.ec2.internal.18a6ef5b8ec0211c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-41.ec2.internal,UID:ip-10-0-129-41.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-41.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-41.ec2.internal,},FirstTimestamp:2026-04-16 20:11:19.56025782 +0000 UTC m=+0.467603238,LastTimestamp:2026-04-16 20:11:19.56025782 +0000 UTC m=+0.467603238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-41.ec2.internal,}"
Apr 16 20:11:19.562814 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.562803 2571 policy_none.go:49] "None policy: Start"
Apr 16 20:11:19.562847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.562820 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:11:19.562847 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.562829 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:11:19.597167 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597151 2571 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:11:19.597247 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.597196 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:11:19.597247 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597208 2571 server.go:85] "Starting device plugin registration server"
Apr 16 20:11:19.597500 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597487 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:11:19.597570 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597501 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:11:19.597667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597649 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:11:19.598093 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597771 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:11:19.598093 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.597780 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.598291 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.598339 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.612611 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.614001 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.614041 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.614065 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.614076 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:11:19.615882 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.614122 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:11:19.616356 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.616342 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:19.698591 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.698472 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:19.700953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.700933 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:19.701081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.700969 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:19.701081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.700983 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:19.701081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.701015 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.712216 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.712196 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.712329 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.712220 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-41.ec2.internal\": node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:19.715071 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.715054 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"]
Apr 16 20:11:19.715126 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.715117 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:19.717894 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.717877 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:19.718022 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.717904 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:19.718022 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.717918 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:19.719225 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.719209 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:19.719379 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.719365 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.719445 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.719401 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:19.722961 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.722937 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:19.722961 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.722949 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:19.723108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.722968 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:19.723108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.722984 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:19.723108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.722970 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:19.723108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.723061 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:19.724144 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.724128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.724227 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.724158 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:19.725166 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.725151 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:19.725240 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.725171 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:19.725240 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.725184 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:19.728103 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.728083 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:19.740665 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.740636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b356ff33c9b62e243b4c712e3bd6b686-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal\" (UID: \"b356ff33c9b62e243b4c712e3bd6b686\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.740759 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.740677 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b356ff33c9b62e243b4c712e3bd6b686-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal\" (UID: \"b356ff33c9b62e243b4c712e3bd6b686\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.740759 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.740698 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d35c82c9bfffde60f444dc698a8db807-config\") pod \"kube-apiserver-proxy-ip-10-0-129-41.ec2.internal\" (UID: \"d35c82c9bfffde60f444dc698a8db807\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.746158 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.746138 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-41.ec2.internal\" not found" node="ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.750052 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.750026 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-41.ec2.internal\" not found" node="ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.829011 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.828982 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:19.841313 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.841287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d35c82c9bfffde60f444dc698a8db807-config\") pod \"kube-apiserver-proxy-ip-10-0-129-41.ec2.internal\" (UID: \"d35c82c9bfffde60f444dc698a8db807\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.841370 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.841320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b356ff33c9b62e243b4c712e3bd6b686-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal\" (UID: \"b356ff33c9b62e243b4c712e3bd6b686\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.841370 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.841337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b356ff33c9b62e243b4c712e3bd6b686-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal\" (UID: \"b356ff33c9b62e243b4c712e3bd6b686\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.841434 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.841379 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b356ff33c9b62e243b4c712e3bd6b686-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal\" (UID: \"b356ff33c9b62e243b4c712e3bd6b686\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.841434 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.841380 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d35c82c9bfffde60f444dc698a8db807-config\") pod \"kube-apiserver-proxy-ip-10-0-129-41.ec2.internal\" (UID: \"d35c82c9bfffde60f444dc698a8db807\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.841434 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:19.841418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b356ff33c9b62e243b4c712e3bd6b686-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal\" (UID: \"b356ff33c9b62e243b4c712e3bd6b686\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:19.929460 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:19.929431 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:20.030315 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.030246 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:20.047812 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.047788 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:20.052250 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.052233 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"
Apr 16 20:11:20.130915 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.130874 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:20.231375 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.231344 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:20.332041 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.331971 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:20.432495 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.432456 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-41.ec2.internal\" not found"
Apr 16 20:11:20.447020 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.446996 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:11:20.447162 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.447147 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:11:20.474765 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.474735 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:11:20.518868 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.518843 2571 apiserver.go:52] "Watching apiserver"
Apr 16 20:11:20.529608 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.529575 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:11:20.531815 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.530706 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2gf7l","openshift-network-diagnostics/network-check-target-zdf2s","openshift-ovn-kubernetes/ovnkube-node-blr8v","kube-system/konnectivity-agent-s4k6r","openshift-cluster-node-tuning-operator/tuned-2qpdd","openshift-multus/multus-additional-cni-plugins-fbpzt","openshift-multus/network-metrics-daemon-h5ntx","openshift-network-operator/iptables-alerter-f69jd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk","openshift-image-registry/node-ca-xz7cv"]
Apr 16 20:11:20.533111 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.533084 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.534451 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.534242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:20.534451 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.534374 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:20.535545 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.535502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.535788 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.535770 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:11:20.535885 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.535770 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 20:11:20.535957 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.535778 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6dh22\""
Apr 16 20:11:20.536011 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.535873 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:11:20.536011 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.535935 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:11:20.536651 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.536630 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s4k6r"
Apr 16 20:11:20.537800 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.537778 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 20:11:20.538061 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538046 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 20:11:20.538194 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538092 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 20:11:20.538466 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538442 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.538634 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 20:11:20.538634 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538544 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 20:11:20.538634 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538558 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:20.538822 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 20:11:20.538886 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538829 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-b48jt\""
Apr 16 20:11:20.538886
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538865 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wrh2w\"" Apr 16 20:11:20.538886 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.538865 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:11:20.539048 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.539036 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:11:20.539173 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.539159 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal" Apr 16 20:11:20.539774 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.539761 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.540757 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.540740 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:20.541029 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.541010 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:20.541141 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.541107 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zf9lf\"" Apr 16 20:11:20.541141 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.541127 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:20.541250 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.541163 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:20.542128 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.542103 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:20.542213 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.542171 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-p9nzq\"" Apr 16 20:11:20.542270 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.542120 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:20.542578 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.542562 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.543873 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.543855 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.545212 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545192 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.545300 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545248 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysctl-conf\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.545300 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-sys\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.545376 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z78n\" (UniqueName: \"kubernetes.io/projected/ee593b7f-fc54-40a3-af7d-f5643196a107-kube-api-access-2z78n\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:20.545376 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-kubelet\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545439 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-conf-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545439 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysctl-d\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.545505 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545460 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xj7\" (UniqueName: \"kubernetes.io/projected/1f2eb888-db83-4f12-83ec-2f634c4cf807-kube-api-access-l2xj7\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.545505 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545488 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-slash\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.545635 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545505 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.545635 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545558 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebce4803-f39d-4f96-8ee2-9b2eab78da74-tmp\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.545635 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.545635 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-run\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.545635 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-var-lib-kubelet\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 
20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-cni-multus\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-os-release\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-cni-bin\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545746 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-cnibin\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-system-cni-dir\") pod \"multus-2gf7l\" (UID: 
\"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/844406c9-7055-481f-ae73-5d4d7500e71d-multus-daemon-config\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-etc-kubernetes\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.545866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-systemd\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-etc-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-kubernetes\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-system-cni-dir\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.545994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-multus-certs\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546041 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-socket-dir-parent\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-log-socket\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546106 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-cni-bin\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546132 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovn-node-metrics-cert\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-lib-modules\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546199 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-cnibin\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " 
pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546223 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-var-lib-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546244 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-env-overrides\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546261 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8d864121-3e7d-4667-a357-2bc3c0ff03ca-agent-certs\") pod \"konnectivity-agent-s4k6r\" (UID: \"8d864121-3e7d-4667-a357-2bc3c0ff03ca\") " pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-kubelet\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-ovn\") pod 
\"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546367 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-node-log\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovnkube-script-lib\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546405 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysconfig\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-os-release\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qv66t\" (UniqueName: \"kubernetes.io/projected/9c611987-0423-4488-b0f7-408d1c68cda1-kube-api-access-qv66t\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-systemd-units\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-run-ovn-kubernetes\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-tuned\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlhdl\" (UniqueName: \"kubernetes.io/projected/ebce4803-f39d-4f96-8ee2-9b2eab78da74-kube-api-access-mlhdl\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 
20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546643 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-netns\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-cni-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546696 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8d864121-3e7d-4667-a357-2bc3c0ff03ca-konnectivity-ca\") pod \"konnectivity-agent-s4k6r\" (UID: \"8d864121-3e7d-4667-a357-2bc3c0ff03ca\") " pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:20.546925 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/844406c9-7055-481f-ae73-5d4d7500e71d-cni-binary-copy\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-hostroot\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78742\" (UniqueName: \"kubernetes.io/projected/844406c9-7055-481f-ae73-5d4d7500e71d-kube-api-access-78742\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546836 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546857 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-run-netns\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovnkube-config\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.546997 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-modprobe-d\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547020 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-host\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-k8s-cni-cncf-io\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-cni-netd\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-systemd\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.547672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547304 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:20.547672 
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547568 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547737 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547760 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547864 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.547914 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.548035 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8bb64\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.548061 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.548165 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.548184 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6tprs\"" Apr 16 20:11:20.548372 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.548277 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:11:20.548947 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.548668 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vc456\"" Apr 16 20:11:20.559181 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.559154 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal"] Apr 16 20:11:20.559637 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.559616 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:20.559692 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.559682 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal" Apr 16 20:11:20.560353 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.560328 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:19 +0000 UTC" deadline="2027-10-20 15:56:41.974164141 +0000 UTC" Apr 16 20:11:20.560406 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.560355 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13243h45m21.413812618s" Apr 16 20:11:20.561585 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.561569 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:20.573379 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.573358 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal"] Apr 16 20:11:20.573546 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.573433 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:20.578203 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.578185 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8pmhc" Apr 16 20:11:20.587778 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.587752 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8pmhc" Apr 16 20:11:20.622938 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.622899 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd35c82c9bfffde60f444dc698a8db807.slice/crio-a5576ddec5f759d78ddd36195b92690882d041b6aa61bf456a421d83b11ca754 WatchSource:0}: Error finding container a5576ddec5f759d78ddd36195b92690882d041b6aa61bf456a421d83b11ca754: Status 404 returned error can't find the container with id a5576ddec5f759d78ddd36195b92690882d041b6aa61bf456a421d83b11ca754 Apr 16 20:11:20.624820 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.624796 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb356ff33c9b62e243b4c712e3bd6b686.slice/crio-106564778efca2107098bcfd0b625717a2cec6063b82fb7a1812433d3afa8a7a WatchSource:0}: Error finding container 106564778efca2107098bcfd0b625717a2cec6063b82fb7a1812433d3afa8a7a: Status 404 returned error can't find the container with id 
106564778efca2107098bcfd0b625717a2cec6063b82fb7a1812433d3afa8a7a Apr 16 20:11:20.629157 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.629133 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:20.640499 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.640479 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:20.647670 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-cnibin\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.647775 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-sys-fs\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.647775 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-var-lib-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647775 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-env-overrides\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647775 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8d864121-3e7d-4667-a357-2bc3c0ff03ca-agent-certs\") pod \"konnectivity-agent-s4k6r\" (UID: \"8d864121-3e7d-4667-a357-2bc3c0ff03ca\") " pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:20.647775 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647765 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-cnibin\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647773 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-var-lib-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7aa3c845-972e-41e1-89d2-9126f2eb4905-serviceca\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647812 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-kubelet\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-ovn\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-node-log\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-kubelet\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovnkube-script-lib\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysconfig\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647900 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-ovn\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-node-log\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.647953 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-os-release\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.647981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv66t\" (UniqueName: \"kubernetes.io/projected/9c611987-0423-4488-b0f7-408d1c68cda1-kube-api-access-qv66t\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648009 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-systemd-units\") pod \"ovnkube-node-blr8v\" (UID: 
\"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysconfig\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-run-ovn-kubernetes\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-os-release\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-run-ovn-kubernetes\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-tuned\") pod 
\"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-systemd-units\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlhdl\" (UniqueName: \"kubernetes.io/projected/ebce4803-f39d-4f96-8ee2-9b2eab78da74-kube-api-access-mlhdl\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-netns\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648279 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-env-overrides\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-cni-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8d864121-3e7d-4667-a357-2bc3c0ff03ca-konnectivity-ca\") pod \"konnectivity-agent-s4k6r\" (UID: \"8d864121-3e7d-4667-a357-2bc3c0ff03ca\") " pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:20.648672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-netns\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648700 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648748 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-cni-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/844406c9-7055-481f-ae73-5d4d7500e71d-cni-binary-copy\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-hostroot\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78742\" (UniqueName: 
\"kubernetes.io/projected/844406c9-7055-481f-ae73-5d4d7500e71d-kube-api-access-78742\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/b52a462e-c070-4f15-8a70-36589ff82e9b-kube-api-access-hpvkk\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7aa3c845-972e-41e1-89d2-9126f2eb4905-host\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovnkube-script-lib\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.648986 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933ff827-8c81-4476-a08c-6f416ce84bd6-host-slash\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-run-netns\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovnkube-config\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649243 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-modprobe-d\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.649299 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-hostroot\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649343 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-run-netns\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-host\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " 
pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-k8s-cni-cncf-io\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-cni-netd\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-host\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-modprobe-d\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649594 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-k8s-cni-cncf-io\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-cni-netd\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649642 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-systemd\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-socket-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-systemd\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysctl-conf\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.649921 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:20.650015 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.649974 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-sys\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z78n\" (UniqueName: \"kubernetes.io/projected/ee593b7f-fc54-40a3-af7d-f5643196a107-kube-api-access-2z78n\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650131 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovnkube-config\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysctl-conf\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-sys\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-kubelet\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.650290 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:21.150248356 +0000 UTC m=+2.057593777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-kubelet\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-conf-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650386 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-etc-selinux\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysctl-d\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xj7\" (UniqueName: \"kubernetes.io/projected/1f2eb888-db83-4f12-83ec-2f634c4cf807-kube-api-access-l2xj7\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-conf-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-device-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/933ff827-8c81-4476-a08c-6f416ce84bd6-iptables-alerter-script\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd"
Apr 16 20:11:20.650793 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650670 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-sysctl-d\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8d864121-3e7d-4667-a357-2bc3c0ff03ca-konnectivity-ca\") pod \"konnectivity-agent-s4k6r\" (UID: \"8d864121-3e7d-4667-a357-2bc3c0ff03ca\") " pod="kube-system/konnectivity-agent-s4k6r"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-slash\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650819 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebce4803-f39d-4f96-8ee2-9b2eab78da74-tmp\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650852 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-run\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.650973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-slash\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-run\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651058 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-var-lib-kubelet\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-cni-multus\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651131 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/844406c9-7055-481f-ae73-5d4d7500e71d-cni-binary-copy\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-os-release\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-var-lib-kubelet\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-cni-multus\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-os-release\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.651450 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-cni-bin\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651344 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-var-lib-cni-bin\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4tmn\" (UniqueName: \"kubernetes.io/projected/933ff827-8c81-4476-a08c-6f416ce84bd6-kube-api-access-b4tmn\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-cnibin\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-system-cni-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651477 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-cnibin\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/844406c9-7055-481f-ae73-5d4d7500e71d-multus-daemon-config\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-etc-kubernetes\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.651582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-registration-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-systemd\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/844406c9-7055-481f-ae73-5d4d7500e71d-multus-daemon-config\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-etc-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-kubernetes\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-system-cni-dir\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-system-cni-dir\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-multus-certs\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c611987-0423-4488-b0f7-408d1c68cda1-system-cni-dir\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-etc-kubernetes\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-run-systemd\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652355 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-etc-openvswitch\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-kubernetes\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj4qt\" (UniqueName: \"kubernetes.io/projected/7aa3c845-972e-41e1-89d2-9126f2eb4905-kube-api-access-wj4qt\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652515 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-host-run-multus-certs\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-socket-dir-parent\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-log-socket\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-cni-bin\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovn-node-metrics-cert\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-lib-modules\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/844406c9-7055-481f-ae73-5d4d7500e71d-multus-socket-dir-parent\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.652931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.652720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.653474 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653215 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c611987-0423-4488-b0f7-408d1c68cda1-cni-binary-copy\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.653474 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-log-socket\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.653474 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1f2eb888-db83-4f12-83ec-2f634c4cf807-host-cni-bin\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.653474 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ebce4803-f39d-4f96-8ee2-9b2eab78da74-lib-modules\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.653670 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ebce4803-f39d-4f96-8ee2-9b2eab78da74-etc-tuned\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.653735 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ebce4803-f39d-4f96-8ee2-9b2eab78da74-tmp\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.653841 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.653826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8d864121-3e7d-4667-a357-2bc3c0ff03ca-agent-certs\") pod \"konnectivity-agent-s4k6r\" (UID: \"8d864121-3e7d-4667-a357-2bc3c0ff03ca\") " pod="kube-system/konnectivity-agent-s4k6r"
Apr 16 20:11:20.655764 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.655745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1f2eb888-db83-4f12-83ec-2f634c4cf807-ovn-node-metrics-cert\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.659014 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.658994 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv66t\" (UniqueName: \"kubernetes.io/projected/9c611987-0423-4488-b0f7-408d1c68cda1-kube-api-access-qv66t\") pod \"multus-additional-cni-plugins-fbpzt\" (UID: \"9c611987-0423-4488-b0f7-408d1c68cda1\") " pod="openshift-multus/multus-additional-cni-plugins-fbpzt"
Apr 16 20:11:20.660603 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.660585 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:20.660687 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.660606 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:20.660687 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.660619 2571 projected.go:194] Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:20.660687 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:20.660680 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:21.16066331 +0000 UTC m=+2.068008727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:20.662660 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.662637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlhdl\" (UniqueName: \"kubernetes.io/projected/ebce4803-f39d-4f96-8ee2-9b2eab78da74-kube-api-access-mlhdl\") pod \"tuned-2qpdd\" (UID: \"ebce4803-f39d-4f96-8ee2-9b2eab78da74\") " pod="openshift-cluster-node-tuning-operator/tuned-2qpdd"
Apr 16 20:11:20.663194 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.663166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xj7\" (UniqueName: \"kubernetes.io/projected/1f2eb888-db83-4f12-83ec-2f634c4cf807-kube-api-access-l2xj7\") pod \"ovnkube-node-blr8v\" (UID: \"1f2eb888-db83-4f12-83ec-2f634c4cf807\") " pod="openshift-ovn-kubernetes/ovnkube-node-blr8v"
Apr 16 20:11:20.663413 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.663396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78742\" (UniqueName: \"kubernetes.io/projected/844406c9-7055-481f-ae73-5d4d7500e71d-kube-api-access-78742\") pod \"multus-2gf7l\" (UID: \"844406c9-7055-481f-ae73-5d4d7500e71d\") " pod="openshift-multus/multus-2gf7l"
Apr 16 20:11:20.663701 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.663687 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z78n\" (UniqueName: \"kubernetes.io/projected/ee593b7f-fc54-40a3-af7d-f5643196a107-kube-api-access-2z78n\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:20.753827 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-registration-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.753827 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj4qt\" (UniqueName: \"kubernetes.io/projected/7aa3c845-972e-41e1-89d2-9126f2eb4905-kube-api-access-wj4qt\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-sys-fs\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7aa3c845-972e-41e1-89d2-9126f2eb4905-serviceca\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753900 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/b52a462e-c070-4f15-8a70-36589ff82e9b-kube-api-access-hpvkk\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7aa3c845-972e-41e1-89d2-9126f2eb4905-host\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933ff827-8c81-4476-a08c-6f416ce84bd6-host-slash\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-registration-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk"
Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-socket-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID:
\"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-etc-selinux\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-sys-fs\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7aa3c845-972e-41e1-89d2-9126f2eb4905-host\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.753997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/933ff827-8c81-4476-a08c-6f416ce84bd6-host-slash\") pod 
\"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.754081 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-device-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/933ff827-8c81-4476-a08c-6f416ce84bd6-iptables-alerter-script\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754136 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-device-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-socket-dir\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4tmn\" (UniqueName: \"kubernetes.io/projected/933ff827-8c81-4476-a08c-6f416ce84bd6-kube-api-access-b4tmn\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b52a462e-c070-4f15-8a70-36589ff82e9b-etc-selinux\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.754481 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7aa3c845-972e-41e1-89d2-9126f2eb4905-serviceca\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.754701 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.754557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/933ff827-8c81-4476-a08c-6f416ce84bd6-iptables-alerter-script\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.763175 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.763146 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hpvkk\" (UniqueName: \"kubernetes.io/projected/b52a462e-c070-4f15-8a70-36589ff82e9b-kube-api-access-hpvkk\") pod \"aws-ebs-csi-driver-node-jprdk\" (UID: \"b52a462e-c070-4f15-8a70-36589ff82e9b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.763284 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.763252 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4tmn\" (UniqueName: \"kubernetes.io/projected/933ff827-8c81-4476-a08c-6f416ce84bd6-kube-api-access-b4tmn\") pod \"iptables-alerter-f69jd\" (UID: \"933ff827-8c81-4476-a08c-6f416ce84bd6\") " pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.763326 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.763271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj4qt\" (UniqueName: \"kubernetes.io/projected/7aa3c845-972e-41e1-89d2-9126f2eb4905-kube-api-access-wj4qt\") pod \"node-ca-xz7cv\" (UID: \"7aa3c845-972e-41e1-89d2-9126f2eb4905\") " pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.852599 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.852489 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2gf7l" Apr 16 20:11:20.859459 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.859430 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod844406c9_7055_481f_ae73_5d4d7500e71d.slice/crio-e428a44d624e3514add9cab2e99195c36f08845f946878a3daa27933d25eec2b WatchSource:0}: Error finding container e428a44d624e3514add9cab2e99195c36f08845f946878a3daa27933d25eec2b: Status 404 returned error can't find the container with id e428a44d624e3514add9cab2e99195c36f08845f946878a3daa27933d25eec2b Apr 16 20:11:20.873913 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.873892 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" Apr 16 20:11:20.879943 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.879914 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb52a462e_c070_4f15_8a70_36589ff82e9b.slice/crio-a23779ae61e5d8287466a57135ae48d5ba8b8e3ca06e785c941d754bff38fbec WatchSource:0}: Error finding container a23779ae61e5d8287466a57135ae48d5ba8b8e3ca06e785c941d754bff38fbec: Status 404 returned error can't find the container with id a23779ae61e5d8287466a57135ae48d5ba8b8e3ca06e785c941d754bff38fbec Apr 16 20:11:20.890634 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.890614 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:20.897179 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.897155 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2eb888_db83_4f12_83ec_2f634c4cf807.slice/crio-5ff932af4c29e5550822bd4094debd98b0e3601d15e26b1109d57a46cb644968 WatchSource:0}: Error finding container 5ff932af4c29e5550822bd4094debd98b0e3601d15e26b1109d57a46cb644968: Status 404 returned error can't find the container with id 5ff932af4c29e5550822bd4094debd98b0e3601d15e26b1109d57a46cb644968 Apr 16 20:11:20.901304 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.901281 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:20.907301 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.907275 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d864121_3e7d_4667_a357_2bc3c0ff03ca.slice/crio-eeb5b74ce123a4aa7b5c2c431e88dcbffb7cf4e2b2dab43b5dadc18cef189212 WatchSource:0}: Error finding container eeb5b74ce123a4aa7b5c2c431e88dcbffb7cf4e2b2dab43b5dadc18cef189212: Status 404 returned error can't find the container with id eeb5b74ce123a4aa7b5c2c431e88dcbffb7cf4e2b2dab43b5dadc18cef189212 Apr 16 20:11:20.921725 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.921705 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:20.929873 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.929853 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" Apr 16 20:11:20.936054 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.936038 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" Apr 16 20:11:20.936336 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.936312 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebce4803_f39d_4f96_8ee2_9b2eab78da74.slice/crio-733c70a4aada171fc168601f56886ed1657540bff53f9611e812bfca0f306b49 WatchSource:0}: Error finding container 733c70a4aada171fc168601f56886ed1657540bff53f9611e812bfca0f306b49: Status 404 returned error can't find the container with id 733c70a4aada171fc168601f56886ed1657540bff53f9611e812bfca0f306b49 Apr 16 20:11:20.942065 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.942048 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f69jd" Apr 16 20:11:20.942612 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.942505 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c611987_0423_4488_b0f7_408d1c68cda1.slice/crio-2c2c53e8baa6549d36d1cedc59ce18fa5ad29a8cfa9862fd30bd508ae159a76c WatchSource:0}: Error finding container 2c2c53e8baa6549d36d1cedc59ce18fa5ad29a8cfa9862fd30bd508ae159a76c: Status 404 returned error can't find the container with id 2c2c53e8baa6549d36d1cedc59ce18fa5ad29a8cfa9862fd30bd508ae159a76c Apr 16 20:11:20.943352 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:20.943333 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xz7cv" Apr 16 20:11:20.949580 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.949561 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod933ff827_8c81_4476_a08c_6f416ce84bd6.slice/crio-535504ba24116e1f69cf95b3e2f4682870fd853acbe4c2f970bbda8312dfd523 WatchSource:0}: Error finding container 535504ba24116e1f69cf95b3e2f4682870fd853acbe4c2f970bbda8312dfd523: Status 404 returned error can't find the container with id 535504ba24116e1f69cf95b3e2f4682870fd853acbe4c2f970bbda8312dfd523 Apr 16 20:11:20.951786 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:20.951767 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa3c845_972e_41e1_89d2_9126f2eb4905.slice/crio-ac13afb06069f55ab83344deb5a20bafca36f3d63408cbf24f4c71834971b544 WatchSource:0}: Error finding container ac13afb06069f55ab83344deb5a20bafca36f3d63408cbf24f4c71834971b544: Status 404 returned error can't find the container with id ac13afb06069f55ab83344deb5a20bafca36f3d63408cbf24f4c71834971b544 Apr 16 20:11:21.024559 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.024513 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:21.157465 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.157389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:21.157642 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:21.157589 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:21.157695 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:21.157673 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:22.157646744 +0000 UTC m=+3.064992149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:21.256806 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.256775 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:21.258324 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.258292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:21.258502 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:21.258482 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:21.258602 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:21.258510 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:21.258602 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:21.258542 2571 projected.go:194] 
Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:21.258746 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:21.258612 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:22.258591548 +0000 UTC m=+3.165936951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:21.588578 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.588464 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:20 +0000 UTC" deadline="2027-10-16 22:53:22.626603533 +0000 UTC" Apr 16 20:11:21.588578 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.588508 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13154h42m1.038099417s" Apr 16 20:11:21.637156 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.637089 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s4k6r" event={"ID":"8d864121-3e7d-4667-a357-2bc3c0ff03ca","Type":"ContainerStarted","Data":"eeb5b74ce123a4aa7b5c2c431e88dcbffb7cf4e2b2dab43b5dadc18cef189212"} Apr 16 20:11:21.640675 ip-10-0-129-41 kubenswrapper[2571]: 
I0416 20:11:21.640643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"5ff932af4c29e5550822bd4094debd98b0e3601d15e26b1109d57a46cb644968"} Apr 16 20:11:21.653547 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.653498 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" event={"ID":"b52a462e-c070-4f15-8a70-36589ff82e9b","Type":"ContainerStarted","Data":"a23779ae61e5d8287466a57135ae48d5ba8b8e3ca06e785c941d754bff38fbec"} Apr 16 20:11:21.662674 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.662632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal" event={"ID":"b356ff33c9b62e243b4c712e3bd6b686","Type":"ContainerStarted","Data":"106564778efca2107098bcfd0b625717a2cec6063b82fb7a1812433d3afa8a7a"} Apr 16 20:11:21.674628 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.674588 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xz7cv" event={"ID":"7aa3c845-972e-41e1-89d2-9126f2eb4905","Type":"ContainerStarted","Data":"ac13afb06069f55ab83344deb5a20bafca36f3d63408cbf24f4c71834971b544"} Apr 16 20:11:21.685145 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.685097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f69jd" event={"ID":"933ff827-8c81-4476-a08c-6f416ce84bd6","Type":"ContainerStarted","Data":"535504ba24116e1f69cf95b3e2f4682870fd853acbe4c2f970bbda8312dfd523"} Apr 16 20:11:21.689359 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.689315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" 
event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerStarted","Data":"2c2c53e8baa6549d36d1cedc59ce18fa5ad29a8cfa9862fd30bd508ae159a76c"} Apr 16 20:11:21.694996 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.694965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" event={"ID":"ebce4803-f39d-4f96-8ee2-9b2eab78da74","Type":"ContainerStarted","Data":"733c70a4aada171fc168601f56886ed1657540bff53f9611e812bfca0f306b49"} Apr 16 20:11:21.710177 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.710136 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7l" event={"ID":"844406c9-7055-481f-ae73-5d4d7500e71d","Type":"ContainerStarted","Data":"e428a44d624e3514add9cab2e99195c36f08845f946878a3daa27933d25eec2b"} Apr 16 20:11:21.749451 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:21.749416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal" event={"ID":"d35c82c9bfffde60f444dc698a8db807","Type":"ContainerStarted","Data":"a5576ddec5f759d78ddd36195b92690882d041b6aa61bf456a421d83b11ca754"} Apr 16 20:11:22.165550 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:22.165500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:22.165765 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.165684 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:22.165765 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.165751 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:24.165732703 +0000 UTC m=+5.073078105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:22.266327 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:22.266286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:22.266554 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.266495 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:22.266554 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.266514 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:22.266554 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.266548 2571 projected.go:194] Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:22.266734 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.266618 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:24.266598272 +0000 UTC m=+5.173943670 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:22.589573 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:22.589395 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:20 +0000 UTC" deadline="2027-09-20 07:59:52.280533328 +0000 UTC" Apr 16 20:11:22.589573 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:22.589443 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12515h48m29.691094289s" Apr 16 20:11:22.615509 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:22.614701 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:22.615509 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.614826 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:22.615509 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:22.615339 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:22.615509 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:22.615457 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:24.184252 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:24.184208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:24.184862 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.184346 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:24.184862 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.184409 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:28.184391444 +0000 UTC m=+9.091736846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:24.284634 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:24.284594 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:24.284878 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.284803 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:24.284878 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.284825 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:24.284878 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.284839 2571 projected.go:194] Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:24.285015 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.284898 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:28.284877933 +0000 UTC m=+9.192223336 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:24.615381 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:24.615296 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:24.615633 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.615441 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:24.615633 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:24.615588 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:24.615769 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:24.615688 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:26.615113 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:26.615079 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:26.615113 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:26.615125 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:26.615666 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:26.615220 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:26.615666 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:26.615308 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:28.218349 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:28.218305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:28.218861 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.218490 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:28.218861 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.218589 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:36.218568959 +0000 UTC m=+17.125914365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:28.319152 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:28.319109 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:28.319335 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.319277 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:28.319335 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.319298 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:28.319335 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.319311 2571 projected.go:194] Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:28.319479 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.319364 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:36.319350911 +0000 UTC m=+17.226696309 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:28.615077 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:28.614732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:28.615255 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.615138 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:28.615603 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:28.615584 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:28.615728 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:28.615680 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:30.615276 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:30.614818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:30.615276 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:30.614866 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:30.615276 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:30.614966 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:30.615276 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:30.615128 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:32.614358 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:32.614323 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:32.614817 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:32.614324 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:32.614817 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:32.614474 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:32.614817 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:32.614576 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:34.614749 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:34.614662 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:34.615167 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:34.614662 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:34.615167 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:34.614807 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:34.615167 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:34.614902 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:36.268748 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:36.268700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:36.269250 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.268851 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:36.269250 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.268934 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:52.268911542 +0000 UTC m=+33.176256942 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:11:36.369834 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:36.369797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:36.370040 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.369952 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 20:11:36.370040 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.369970 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 20:11:36.370040 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.369982 2571 projected.go:194] Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:36.370175 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.370050 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:52.37003208 +0000 UTC m=+33.277377501 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 20:11:36.615102 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:36.615022 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:36.615271 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:36.615023 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:36.615271 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.615164 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:36.615271 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:36.615213 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:38.472752 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.472719 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-24qzw"]
Apr 16 20:11:38.516389 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.516351 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.521188 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.521167 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 20:11:38.521305 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.521216 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gwpsm\""
Apr 16 20:11:38.521305 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.521219 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 20:11:38.585038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.585002 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-tmp-dir\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.585038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.585038 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhz55\" (UniqueName: \"kubernetes.io/projected/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-kube-api-access-qhz55\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.585242 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.585059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-hosts-file\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.614454 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.614431 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:38.614587 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.614494 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:38.614650 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:38.614630 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:38.614778 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:38.614757 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:38.685614 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.685577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhz55\" (UniqueName: \"kubernetes.io/projected/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-kube-api-access-qhz55\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.685614 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.685616 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-hosts-file\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.685821 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.685708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-tmp-dir\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.685821 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.685782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-hosts-file\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.686001 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.685983 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-tmp-dir\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.694934 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.694910 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhz55\" (UniqueName: \"kubernetes.io/projected/916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e-kube-api-access-qhz55\") pod \"node-resolver-24qzw\" (UID: \"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e\") " pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:38.824892 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:38.824859 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-24qzw"
Apr 16 20:11:39.783424 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.783152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" event={"ID":"ebce4803-f39d-4f96-8ee2-9b2eab78da74","Type":"ContainerStarted","Data":"cc05eba34e8c088e0f3eb7df06773114041fc43003aacd5a186a824a426caa2f"}
Apr 16 20:11:39.785638 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.785304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gf7l" event={"ID":"844406c9-7055-481f-ae73-5d4d7500e71d","Type":"ContainerStarted","Data":"ff9f4baeace80d36dca142ce7f82743a19f6f7319a111766662e7b0f3492997c"}
Apr 16 20:11:39.791744 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.791638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal" event={"ID":"d35c82c9bfffde60f444dc698a8db807","Type":"ContainerStarted","Data":"4487f31e0bec529c54fed203b3d6e67ddb1a8ba2260d92b722c94bab01ca5d91"}
Apr 16 20:11:39.795266 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.795223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-24qzw" event={"ID":"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e","Type":"ContainerStarted","Data":"a21ee7d0f7a2d4c367d93d70844c45b1ddc8ee29e16f311826e1d9a33fc560d8"}
Apr 16 20:11:39.798789 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.798732 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2qpdd" podStartSLOduration=2.709248876 podStartE2EDuration="20.798714972s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.93784717 +0000 UTC m=+1.845192567" lastFinishedPulling="2026-04-16 20:11:39.027313261 +0000 UTC m=+19.934658663" observedRunningTime="2026-04-16 20:11:39.797986749 +0000 UTC m=+20.705332182" watchObservedRunningTime="2026-04-16 20:11:39.798714972 +0000 UTC m=+20.706060394"
Apr 16 20:11:39.804841 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.804823 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log"
Apr 16 20:11:39.805125 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805103 2571 generic.go:358] "Generic (PLEG): container finished" podID="1f2eb888-db83-4f12-83ec-2f634c4cf807" containerID="f32dde8ca496e48624e15e3e4b7b184c5bd205afb0a41cbc6b4f1ae240c4b048" exitCode=1
Apr 16 20:11:39.805199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"470a5b46b0f8caaf12df4188a47c4a532bd2fd2ce6a457db060810441ed85f56"}
Apr 16 20:11:39.805199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805169 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"5fdf8fe06a515e8b6736700fc7b21db52bae0302e686ddff8cea8a9dbed386a5"}
Apr 16 20:11:39.805199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805182 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"fa8c1a69bc3c51f3522f79057a9f3bc46f0ab427538477268c5524fda03842ff"}
Apr 16 20:11:39.805199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"aa8662aa2573742631aa1948cbe778acc0c7cb8d9a22cf7c6e5a21e50cff7dec"}
Apr 16 20:11:39.805350 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805209 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerDied","Data":"f32dde8ca496e48624e15e3e4b7b184c5bd205afb0a41cbc6b4f1ae240c4b048"}
Apr 16 20:11:39.805350 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.805224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"91cebd49cacb50d4b6d28a64588412556b198727360e9decbe1bdba43afb2f6f"}
Apr 16 20:11:39.812903 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.812864 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-41.ec2.internal" podStartSLOduration=19.812852295 podStartE2EDuration="19.812852295s" podCreationTimestamp="2026-04-16 20:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:39.812710462 +0000 UTC m=+20.720055882" watchObservedRunningTime="2026-04-16 20:11:39.812852295 +0000 UTC m=+20.720197717"
Apr 16 20:11:39.826954 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:39.826870 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2gf7l" podStartSLOduration=2.369565255 podStartE2EDuration="20.826855358s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.861032488 +0000 UTC m=+1.768377886" lastFinishedPulling="2026-04-16 20:11:39.318322587 +0000 UTC m=+20.225667989" observedRunningTime="2026-04-16 20:11:39.826097371 +0000 UTC m=+20.733442792" watchObservedRunningTime="2026-04-16 20:11:39.826855358 +0000 UTC m=+20.734200819"
Apr 16 20:11:40.615011 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.614979 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:40.615202 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.614978 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:40.615202 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:40.615117 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107"
Apr 16 20:11:40.615202 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:40.615171 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807"
Apr 16 20:11:40.808745 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.808705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-24qzw" event={"ID":"916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e","Type":"ContainerStarted","Data":"2acfc6f9ce06261ca6eb165de5a5662dfe79ed54c0b107d64de7ed6ced6adbfb"}
Apr 16 20:11:40.810326 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.810294 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s4k6r" event={"ID":"8d864121-3e7d-4667-a357-2bc3c0ff03ca","Type":"ContainerStarted","Data":"fbde7235b8857a334e7e1e38cb9de01b2b7a0c1e890275e8de5c33e0bb684998"}
Apr 16 20:11:40.811985 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.811953 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" event={"ID":"b52a462e-c070-4f15-8a70-36589ff82e9b","Type":"ContainerStarted","Data":"47e4cea252811f4287327d391fa6eae880979d65d5b946e647ec7389b7c30a67"}
Apr 16 20:11:40.813448 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.813424 2571 generic.go:358] "Generic (PLEG): container finished" podID="b356ff33c9b62e243b4c712e3bd6b686" containerID="9d448aa7e8d4d7be6e9c0c85d8bef6b10bb340b0a142dcde6a13d566e9970b63" exitCode=0
Apr 16 20:11:40.813579 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.813518 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal" event={"ID":"b356ff33c9b62e243b4c712e3bd6b686","Type":"ContainerDied","Data":"9d448aa7e8d4d7be6e9c0c85d8bef6b10bb340b0a142dcde6a13d566e9970b63"}
Apr 16 20:11:40.814907 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.814881 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xz7cv" event={"ID":"7aa3c845-972e-41e1-89d2-9126f2eb4905","Type":"ContainerStarted","Data":"2474465ff079bfaa573fea1d03a94999c55b09a865052a231af105b71cf0faf2"}
Apr 16 20:11:40.816775 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.816751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f69jd" event={"ID":"933ff827-8c81-4476-a08c-6f416ce84bd6","Type":"ContainerStarted","Data":"c1958ea658b2026451b7e017432f850bbac1e7074edbf0a16cff62e40c61f8e0"}
Apr 16 20:11:40.819310 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.819282 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c611987-0423-4488-b0f7-408d1c68cda1" containerID="e871d801dd39048593eaddaac285b8d1547b2cc8d14b4416e6241c3ac7adcc77" exitCode=0
Apr 16 20:11:40.820141 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.820113 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerDied","Data":"e871d801dd39048593eaddaac285b8d1547b2cc8d14b4416e6241c3ac7adcc77"}
Apr 16 20:11:40.823038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.822951 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-24qzw" podStartSLOduration=2.822936758 podStartE2EDuration="2.822936758s" podCreationTimestamp="2026-04-16 20:11:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:40.822647402 +0000 UTC m=+21.729992826" watchObservedRunningTime="2026-04-16 20:11:40.822936758 +0000 UTC m=+21.730282179"
Apr 16 20:11:40.843517 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.843474 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s4k6r" podStartSLOduration=3.72877143 podStartE2EDuration="21.843457999s" podCreationTimestamp="2026-04-16 20:11:19
+0000 UTC" firstStartedPulling="2026-04-16 20:11:20.908979314 +0000 UTC m=+1.816324713" lastFinishedPulling="2026-04-16 20:11:39.02366587 +0000 UTC m=+19.931011282" observedRunningTime="2026-04-16 20:11:40.842797011 +0000 UTC m=+21.750142430" watchObservedRunningTime="2026-04-16 20:11:40.843457999 +0000 UTC m=+21.750803422" Apr 16 20:11:40.903800 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.903664 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f69jd" podStartSLOduration=3.831464473 podStartE2EDuration="21.90364657s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.951041633 +0000 UTC m=+1.858387030" lastFinishedPulling="2026-04-16 20:11:39.023223726 +0000 UTC m=+19.930569127" observedRunningTime="2026-04-16 20:11:40.903440041 +0000 UTC m=+21.810785474" watchObservedRunningTime="2026-04-16 20:11:40.90364657 +0000 UTC m=+21.810991991" Apr 16 20:11:40.907912 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:40.907884 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:11:41.609580 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.609438 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:11:40.90790809Z","UUID":"870bd22c-459a-4077-a088-012a0ed39807","Handler":null,"Name":"","Endpoint":""} Apr 16 20:11:41.611204 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.611184 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:11:41.611345 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.611215 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: 
ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:11:41.827772 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.827745 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:11:41.828596 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.828564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"115965eef046d4af507bcc30477b19b43689b1bd3cfff36777a27017fa4db325"} Apr 16 20:11:41.830983 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.830950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" event={"ID":"b52a462e-c070-4f15-8a70-36589ff82e9b","Type":"ContainerStarted","Data":"cb9231134296222737a807d3773b974e3a9397e79c114d0a38e2a109b12cdb5b"} Apr 16 20:11:41.834322 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.834297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal" event={"ID":"b356ff33c9b62e243b4c712e3bd6b686","Type":"ContainerStarted","Data":"a09a852fc5fd010353ff14eb9f03992fc30bd2c24486766ed40d5711da870a7f"} Apr 16 20:11:41.861151 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.861072 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-41.ec2.internal" podStartSLOduration=21.86105415 podStartE2EDuration="21.86105415s" podCreationTimestamp="2026-04-16 20:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:41.860675013 +0000 UTC m=+22.768020434" watchObservedRunningTime="2026-04-16 20:11:41.86105415 +0000 
UTC m=+22.768399570" Apr 16 20:11:41.861572 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:41.861518 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xz7cv" podStartSLOduration=3.7912420129999997 podStartE2EDuration="21.861509126s" podCreationTimestamp="2026-04-16 20:11:20 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.953324003 +0000 UTC m=+1.860669401" lastFinishedPulling="2026-04-16 20:11:39.023591116 +0000 UTC m=+19.930936514" observedRunningTime="2026-04-16 20:11:40.924904516 +0000 UTC m=+21.832249935" watchObservedRunningTime="2026-04-16 20:11:41.861509126 +0000 UTC m=+22.768854547" Apr 16 20:11:42.615025 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:42.614790 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:42.615212 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:42.614863 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:42.615212 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:42.615115 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807" Apr 16 20:11:42.615212 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:42.615189 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:42.836881 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:42.836785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" event={"ID":"b52a462e-c070-4f15-8a70-36589ff82e9b","Type":"ContainerStarted","Data":"1126b21dacd3f0c372e193bd2f795287d0cc473650bd9f9fda99db07b30fc564"} Apr 16 20:11:43.889251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:43.889202 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jprdk" podStartSLOduration=3.02558266 podStartE2EDuration="23.889185365s" podCreationTimestamp="2026-04-16 20:11:20 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.881603636 +0000 UTC m=+1.788949034" lastFinishedPulling="2026-04-16 20:11:41.745206328 +0000 UTC m=+22.652551739" observedRunningTime="2026-04-16 20:11:42.85995944 +0000 UTC m=+23.767304860" watchObservedRunningTime="2026-04-16 20:11:43.889185365 +0000 UTC m=+24.796530779" Apr 16 20:11:43.889882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:43.889869 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xqx4j"] Apr 16 20:11:43.892059 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:43.892045 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:43.892125 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:43.892110 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xqx4j" podUID="28b55b45-a77c-407d-b55c-8a2538906ceb" Apr 16 20:11:44.021396 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.021366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28b55b45-a77c-407d-b55c-8a2538906ceb-dbus\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.021566 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.021421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28b55b45-a77c-407d-b55c-8a2538906ceb-kubelet-config\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.021566 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.021446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.121879 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.121780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28b55b45-a77c-407d-b55c-8a2538906ceb-dbus\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.121879 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.121845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/28b55b45-a77c-407d-b55c-8a2538906ceb-kubelet-config\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.121879 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.121873 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.122378 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:44.121977 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:44.122378 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.122007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/28b55b45-a77c-407d-b55c-8a2538906ceb-dbus\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.122378 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.122021 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/28b55b45-a77c-407d-b55c-8a2538906ceb-kubelet-config\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.122378 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:44.122040 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret podName:28b55b45-a77c-407d-b55c-8a2538906ceb nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:44.622025904 +0000 UTC m=+25.529371302 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret") pod "global-pull-secret-syncer-xqx4j" (UID: "28b55b45-a77c-407d-b55c-8a2538906ceb") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:44.615230 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.615063 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:44.615379 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.615120 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:44.615379 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:44.615324 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807" Apr 16 20:11:44.615379 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:44.615369 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:44.626356 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.626325 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:44.626543 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:44.626505 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:44.626614 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:44.626603 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret podName:28b55b45-a77c-407d-b55c-8a2538906ceb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.626584925 +0000 UTC m=+26.533930334 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret") pod "global-pull-secret-syncer-xqx4j" (UID: "28b55b45-a77c-407d-b55c-8a2538906ceb") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:44.844307 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.844281 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:11:44.844664 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.844632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"d09cd63409ccfc874f3fe81426cf454ad2af31b26dbfb880e53e5d9a5d059b72"} Apr 16 20:11:44.844990 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.844948 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:44.844990 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.844972 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:44.845227 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.845210 2571 scope.go:117] "RemoveContainer" containerID="f32dde8ca496e48624e15e3e4b7b184c5bd205afb0a41cbc6b4f1ae240c4b048" Apr 16 20:11:44.862444 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:44.862410 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:45.412127 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:45.412094 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:45.412974 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:45.412954 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:45.614892 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:45.614856 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:45.615053 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:45.614988 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xqx4j" podUID="28b55b45-a77c-407d-b55c-8a2538906ceb" Apr 16 20:11:45.639502 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:45.639466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:45.639671 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:45.639619 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:45.639726 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:45.639681 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret podName:28b55b45-a77c-407d-b55c-8a2538906ceb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:47.63966639 +0000 UTC m=+28.547011787 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret") pod "global-pull-secret-syncer-xqx4j" (UID: "28b55b45-a77c-407d-b55c-8a2538906ceb") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:46.129583 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.129006 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xqx4j"] Apr 16 20:11:46.129583 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.129135 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:46.129583 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:46.129247 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xqx4j" podUID="28b55b45-a77c-407d-b55c-8a2538906ceb" Apr 16 20:11:46.131463 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.131435 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h5ntx"] Apr 16 20:11:46.131630 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.131574 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:46.131718 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:46.131689 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:46.134491 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.134460 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zdf2s"] Apr 16 20:11:46.134638 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.134608 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:46.134737 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:46.134714 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807" Apr 16 20:11:46.850395 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.850360 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c611987-0423-4488-b0f7-408d1c68cda1" containerID="9e86e8432d8982a42173d1d0c8bff88f5bcb160245eac0138b9ea4443e13d4d1" exitCode=0 Apr 16 20:11:46.850794 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.850442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerDied","Data":"9e86e8432d8982a42173d1d0c8bff88f5bcb160245eac0138b9ea4443e13d4d1"} Apr 16 20:11:46.853824 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.853806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:11:46.854150 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.854128 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" event={"ID":"1f2eb888-db83-4f12-83ec-2f634c4cf807","Type":"ContainerStarted","Data":"a6eea77550dd53cd3878a01a590008f7282371d6122dfa4f898dbeef7ab7cf2f"} Apr 16 20:11:46.854500 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.854480 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:46.869746 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.869718 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:11:46.899899 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:46.899852 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" podStartSLOduration=9.724292987 podStartE2EDuration="27.899837051s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.898878049 +0000 UTC m=+1.806223446" lastFinishedPulling="2026-04-16 20:11:39.074422112 +0000 UTC m=+19.981767510" observedRunningTime="2026-04-16 20:11:46.899258983 +0000 UTC m=+27.806604404" watchObservedRunningTime="2026-04-16 20:11:46.899837051 +0000 UTC m=+27.807182468" Apr 16 20:11:47.614955 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:47.614920 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:47.614955 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:47.614953 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:47.615147 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:47.615012 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:47.615147 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:47.615123 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:47.615224 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:47.615148 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xqx4j" podUID="28b55b45-a77c-407d-b55c-8a2538906ceb" Apr 16 20:11:47.615258 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:47.615219 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807" Apr 16 20:11:47.655491 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:47.655452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:47.655721 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:47.655698 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:47.655792 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:47.655779 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret podName:28b55b45-a77c-407d-b55c-8a2538906ceb nodeName:}" failed. No retries permitted until 2026-04-16 20:11:51.655761289 +0000 UTC m=+32.563106691 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret") pod "global-pull-secret-syncer-xqx4j" (UID: "28b55b45-a77c-407d-b55c-8a2538906ceb") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:47.858202 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:47.858165 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c611987-0423-4488-b0f7-408d1c68cda1" containerID="a1386f56422a4ab18e65b33003042a76ff6cff00c3fec6135f378fd4c50831cc" exitCode=0 Apr 16 20:11:47.858558 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:47.858264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerDied","Data":"a1386f56422a4ab18e65b33003042a76ff6cff00c3fec6135f378fd4c50831cc"} Apr 16 20:11:48.550920 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:48.550884 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:48.551054 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:48.551036 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 20:11:48.552233 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:48.552200 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s4k6r" Apr 16 20:11:48.862714 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:48.862511 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c611987-0423-4488-b0f7-408d1c68cda1" containerID="efcc016cf4fec5f2b7c11e8cbe2f11ee4d9a47051888e873cc2877740cb209a5" exitCode=0 Apr 16 20:11:48.863118 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:48.862595 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" 
event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerDied","Data":"efcc016cf4fec5f2b7c11e8cbe2f11ee4d9a47051888e873cc2877740cb209a5"} Apr 16 20:11:48.874445 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:48.874397 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" podUID="1f2eb888-db83-4f12-83ec-2f634c4cf807" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 20:11:49.615944 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:49.615909 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:49.616119 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:49.616026 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807" Apr 16 20:11:49.616119 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:49.616087 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:49.616229 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:49.616166 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xqx4j" podUID="28b55b45-a77c-407d-b55c-8a2538906ceb" Apr 16 20:11:49.616229 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:49.616167 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:49.616307 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:49.616264 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:51.615169 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:51.615112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:51.615772 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:51.615118 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:51.615772 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:51.615250 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5ntx" podUID="ee593b7f-fc54-40a3-af7d-f5643196a107" Apr 16 20:11:51.615772 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:51.615128 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:51.615772 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:51.615318 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xqx4j" podUID="28b55b45-a77c-407d-b55c-8a2538906ceb" Apr 16 20:11:51.615772 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:51.615371 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zdf2s" podUID="ade2626d-4dc5-4796-9c91-0c0699095807" Apr 16 20:11:51.686591 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:51.686510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:51.686758 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:51.686645 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:51.686758 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:51.686713 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret podName:28b55b45-a77c-407d-b55c-8a2538906ceb nodeName:}" failed. 
No retries permitted until 2026-04-16 20:11:59.686695795 +0000 UTC m=+40.594041193 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret") pod "global-pull-secret-syncer-xqx4j" (UID: "28b55b45-a77c-407d-b55c-8a2538906ceb") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:52.291744 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.291659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:11:52.291921 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.291820 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:52.291921 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.291899 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs podName:ee593b7f-fc54-40a3-af7d-f5643196a107 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.291881764 +0000 UTC m=+65.199227184 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs") pod "network-metrics-daemon-h5ntx" (UID: "ee593b7f-fc54-40a3-af7d-f5643196a107") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:52.392234 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.392198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:11:52.392428 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.392369 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:52.392428 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.392387 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:52.392428 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.392397 2571 projected.go:194] Error preparing data for projected volume kube-api-access-s4xg7 for pod openshift-network-diagnostics/network-check-target-zdf2s: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:52.392603 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.392449 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7 podName:ade2626d-4dc5-4796-9c91-0c0699095807 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:24.392433249 +0000 UTC m=+65.299778649 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s4xg7" (UniqueName: "kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7") pod "network-check-target-zdf2s" (UID: "ade2626d-4dc5-4796-9c91-0c0699095807") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:52.455246 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.455219 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-41.ec2.internal" event="NodeReady" Apr 16 20:11:52.455409 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.455342 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:11:52.503281 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.503245 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"] Apr 16 20:11:52.539342 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.538647 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5847757746-fm52t"] Apr 16 20:11:52.539342 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.538905 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:11:52.545182 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.544277 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:11:52.545182 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.544588 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 20:11:52.545182 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.544964 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-g68ln\"" Apr 16 20:11:52.547019 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.546993 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:11:52.547019 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.547006 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 20:11:52.562985 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.562959 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5"] Apr 16 20:11:52.563496 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.563470 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.567086 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.566819 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ksxbp\"" Apr 16 20:11:52.567205 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.567112 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 20:11:52.567317 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.567301 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 20:11:52.567608 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.567591 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 20:11:52.574333 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.574307 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 20:11:52.581908 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.581888 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"] Apr 16 20:11:52.582035 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.582020 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5" Apr 16 20:11:52.584745 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.584724 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:52.584859 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.584805 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:52.585109 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.585092 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-q4m7m\"" Apr 16 20:11:52.599838 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.599816 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-984b845fd-2gf9m"] Apr 16 20:11:52.600001 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.599977 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:11:52.602616 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.602588 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6s4hm\"" Apr 16 20:11:52.602720 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.602628 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 20:11:52.602720 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.602706 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 20:11:52.617945 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.617923 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"] Apr 16 20:11:52.618447 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.618061 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:52.620826 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.620804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 20:11:52.620926 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.620863 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 20:11:52.621062 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.621047 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2r8gt\"" Apr 16 20:11:52.621252 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.621235 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 20:11:52.621310 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.621253 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 20:11:52.621503 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.621487 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:11:52.621727 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.621711 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:11:52.637011 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.636971 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qlvhv"] Apr 16 20:11:52.637143 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.637127 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" Apr 16 20:11:52.640156 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.640135 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 20:11:52.640272 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.640166 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 20:11:52.640333 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.640279 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:52.640385 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.640375 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:52.640436 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.640399 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-gbxwm\"" Apr 16 20:11:52.658038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.658012 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"] Apr 16 20:11:52.658178 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.658060 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.660653 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.660634 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 20:11:52.660759 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.660674 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 20:11:52.660759 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.660710 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lzlhd\"" Apr 16 20:11:52.660829 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.660634 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 20:11:52.661182 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.661165 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 20:11:52.666286 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.666015 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 20:11:52.679539 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.679512 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"] Apr 16 20:11:52.679686 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.679670 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" Apr 16 20:11:52.682215 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.682191 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fkq6l\"" Apr 16 20:11:52.682321 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.682245 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:52.682321 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.682192 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:52.682645 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.682624 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 20:11:52.694998 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.694972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-ca-trust-extracted\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695105 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695001 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-stats-auth\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:52.695105 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695024 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksx9s\" (UniqueName: \"kubernetes.io/projected/799e94dc-a712-492e-a369-129299525b15-kube-api-access-ksx9s\") pod \"volume-data-source-validator-7c6cbb6c87-p4xv5\" (UID: \"799e94dc-a712-492e-a369-129299525b15\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5" Apr 16 20:11:52.695105 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/05a809af-d1a2-4af3-9ac4-46b14e4aada1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:11:52.695105 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-trusted-ca\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695105 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-bound-sa-token\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695131 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rscxz\" (UniqueName: 
\"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-kube-api-access-rscxz\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc063ce5-322d-48b4-9d08-1904d3210ecd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pxb\" (UniqueName: \"kubernetes.io/projected/dc063ce5-322d-48b4-9d08-1904d3210ecd-kube-api-access-v5pxb\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-image-registry-private-configuration\") pod 
\"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695385 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-certificates\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695461 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695442 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-installation-pull-secrets\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:52.695792 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695471 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-default-certificate\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:52.695792 
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695499 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:11:52.695792 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:11:52.695792 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:52.695792 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.695611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf45r\" (UniqueName: \"kubernetes.io/projected/6f259876-3b65-4751-ba98-118d8aa205f9-kube-api-access-gf45r\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:52.701287 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701267 2571 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701296 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5847757746-fm52t"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701309 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701321 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701339 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qlvhv"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701351 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701361 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"]
Apr 16 20:11:52.701393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701379 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mchp6"]
Apr 16 20:11:52.701762 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.701435 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"
Apr 16 20:11:52.704041 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.704020 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 20:11:52.704138 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.704077 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 20:11:52.704254 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.704232 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:52.704421 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.704312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zs6ch\""
Apr 16 20:11:52.704421 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.704335 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:52.722493 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.722471 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9"]
Apr 16 20:11:52.722643 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.722626 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:52.725515 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.725386 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 20:11:52.725515 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.725397 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 20:11:52.725515 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.725397 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 20:11:52.725515 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.725404 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v7v7t\""
Apr 16 20:11:52.742545 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.742499 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8cq5p"]
Apr 16 20:11:52.742698 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.742641 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9"
Apr 16 20:11:52.745389 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.745368 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 20:11:52.745694 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.745674 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 20:11:52.745957 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.745935 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-ldft5\""
Apr 16 20:11:52.763738 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.763710 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pjp69"]
Apr 16 20:11:52.763896 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.763864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:11:52.766598 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.766572 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 20:11:52.766722 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.766653 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tnjdz\""
Apr 16 20:11:52.766722 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.766671 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 20:11:52.783767 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.783741 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-984b845fd-2gf9m"]
Apr 16 20:11:52.783767 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.783771 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mchp6"]
Apr 16 20:11:52.783931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.783787 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9"]
Apr 16 20:11:52.783931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.783797 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pjp69"]
Apr 16 20:11:52.783931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.783807 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8cq5p"]
Apr 16 20:11:52.784052 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.783955 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69"
Apr 16 20:11:52.786679 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.786509 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:11:52.786679 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.786558 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 20:11:52.786679 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.786562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rv44b\""
Apr 16 20:11:52.786881 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.786823 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 20:11:52.787007 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.786995 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 20:11:52.792592 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.792572 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 20:11:52.796130 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/89638695-826b-4c66-a544-96f10200a105-tmp\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.796130 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796092 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-image-registry-private-configuration\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.796130 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796116 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d883e9c-34a0-4bc6-8784-879380b900d3-config\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-certificates\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-default-certificate\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-installation-pull-secrets\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796269 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89638695-826b-4c66-a544-96f10200a105-service-ca-bundle\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796293 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89638695-826b-4c66-a544-96f10200a105-serving-cert\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.796319 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796348 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9293d674-a736-4062-a5ad-cc844313fbfe-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-bound-sa-token\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-stats-auth\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rscxz\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-kube-api-access-rscxz\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc063ce5-322d-48b4-9d08-1904d3210ecd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796513 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksx9s\" (UniqueName: \"kubernetes.io/projected/799e94dc-a712-492e-a369-129299525b15-kube-api-access-ksx9s\") pod \"volume-data-source-validator-7c6cbb6c87-p4xv5\" (UID: \"799e94dc-a712-492e-a369-129299525b15\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/05a809af-d1a2-4af3-9ac4-46b14e4aada1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d883e9c-34a0-4bc6-8784-879380b900d3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796618 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:52.796667 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796666 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf45r\" (UniqueName: \"kubernetes.io/projected/6f259876-3b65-4751-ba98-118d8aa205f9-kube-api-access-gf45r\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89638695-826b-4c66-a544-96f10200a105-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-ca-trust-extracted\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796814 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/89638695-826b-4c66-a544-96f10200a105-snapshots\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt259\" (UniqueName: \"kubernetes.io/projected/7f8bccf8-1c1e-4890-98f3-747f76421e6c-kube-api-access-rt259\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9293d674-a736-4062-a5ad-cc844313fbfe-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvtmg\" (UniqueName: \"kubernetes.io/projected/9293d674-a736-4062-a5ad-cc844313fbfe-kube-api-access-zvtmg\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796948 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx2d9\" (UniqueName: \"kubernetes.io/projected/2d883e9c-34a0-4bc6-8784-879380b900d3-kube-api-access-kx2d9\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.796982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pxb\" (UniqueName: \"kubernetes.io/projected/dc063ce5-322d-48b4-9d08-1904d3210ecd-kube-api-access-v5pxb\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.797003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-trusted-ca\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.797031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgw8h\" (UniqueName: \"kubernetes.io/projected/b3bab96c-fc58-4afa-886a-ff24380a19e6-kube-api-access-hgw8h\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:52.797108 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.797054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m2l8\" (UniqueName: \"kubernetes.io/projected/89638695-826b-4c66-a544-96f10200a105-kube-api-access-9m2l8\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.797768 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.797340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-certificates\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.797768 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.797457 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:11:52.797768 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.797514 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert podName:05a809af-d1a2-4af3-9ac4-46b14e4aada1 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.297494776 +0000 UTC m=+34.204840178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h59sf" (UID: "05a809af-d1a2-4af3-9ac4-46b14e4aada1") : secret "networking-console-plugin-cert" not found
Apr 16 20:11:52.797768 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.797524 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/dc063ce5-322d-48b4-9d08-1904d3210ecd-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:52.797942 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.797889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/05a809af-d1a2-4af3-9ac4-46b14e4aada1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"
Apr 16 20:11:52.798189 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798172 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:52.798245 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798234 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.298216174 +0000 UTC m=+34.205561572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : configmap references non-existent config key: service-ca.crt
Apr 16 20:11:52.798306 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798254 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls podName:dc063ce5-322d-48b4-9d08-1904d3210ecd nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.298244027 +0000 UTC m=+34.205589429 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wndvm" (UID: "dc063ce5-322d-48b4-9d08-1904d3210ecd") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:52.798306 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798263 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:11:52.798306 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798299 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.298287649 +0000 UTC m=+34.205633049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : secret "router-metrics-certs-default" not found
Apr 16 20:11:52.798684 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.798385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-ca-trust-extracted\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.798684 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798613 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:11:52.798684 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798632 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5847757746-fm52t: secret "image-registry-tls" not found
Apr 16 20:11:52.798829 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.798706 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls podName:bc1a4b01-eb2b-4766-9d17-3b931cb5aab1 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.298687304 +0000 UTC m=+34.206032717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls") pod "image-registry-5847757746-fm52t" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1") : secret "image-registry-tls" not found
Apr 16 20:11:52.799697 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.799557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-trusted-ca\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.801945 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.801917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-image-registry-private-configuration\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.802909 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.802890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-stats-auth\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.803019 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.802946 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-default-certificate\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.803217 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.803200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-installation-pull-secrets\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.807028 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.806931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksx9s\" (UniqueName: \"kubernetes.io/projected/799e94dc-a712-492e-a369-129299525b15-kube-api-access-ksx9s\") pod \"volume-data-source-validator-7c6cbb6c87-p4xv5\" (UID: \"799e94dc-a712-492e-a369-129299525b15\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5"
Apr 16 20:11:52.808347 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.808324 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rscxz\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-kube-api-access-rscxz\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.809359 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.809335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-bound-sa-token\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:52.809455 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.809350 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf45r\" (UniqueName: \"kubernetes.io/projected/6f259876-3b65-4751-ba98-118d8aa205f9-kube-api-access-gf45r\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:52.810949 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.810928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pxb\" (UniqueName: \"kubernetes.io/projected/dc063ce5-322d-48b4-9d08-1904d3210ecd-kube-api-access-v5pxb\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:52.891094 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.891057 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5"
Apr 16 20:11:52.897939 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.897913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d89c84d3-91b3-4e2f-8457-b44d10b35a08-config-volume\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:11:52.897987 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.897952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89638695-826b-4c66-a544-96f10200a105-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.897987 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.897979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-config\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69"
Apr 16 20:11:52.898058 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/89638695-826b-4c66-a544-96f10200a105-snapshots\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv"
Apr 16 20:11:52.898058 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt259\" (UniqueName: \"kubernetes.io/projected/7f8bccf8-1c1e-4890-98f3-747f76421e6c-kube-api-access-rt259\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:52.898135 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9293d674-a736-4062-a5ad-cc844313fbfe-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"
Apr 16 20:11:52.898135 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvtmg\" (UniqueName: \"kubernetes.io/projected/9293d674-a736-4062-a5ad-cc844313fbfe-kube-api-access-zvtmg\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"
Apr 16 20:11:52.898230 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:52.898230 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898190 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx2d9\" (UniqueName: \"kubernetes.io/projected/2d883e9c-34a0-4bc6-8784-879380b900d3-kube-api-access-kx2d9\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"
Apr 16 20:11:52.898230 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9qqw\" (UniqueName: \"kubernetes.io/projected/025cbe57-f51f-4203-a8cc-6b592a9735f7-kube-api-access-k9qqw\") pod \"network-check-source-8894fc9bd-gvgs9\" (UID: \"025cbe57-f51f-4203-a8cc-6b592a9735f7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9"
Apr 16 20:11:52.898368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgw8h\" (UniqueName: \"kubernetes.io/projected/b3bab96c-fc58-4afa-886a-ff24380a19e6-kube-api-access-hgw8h\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:52.898368
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898288 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9m2l8\" (UniqueName: \"kubernetes.io/projected/89638695-826b-4c66-a544-96f10200a105-kube-api-access-9m2l8\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.898368 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.898295 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:11:52.898368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/89638695-826b-4c66-a544-96f10200a105-tmp\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.898368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d883e9c-34a0-4bc6-8784-879380b900d3-config\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" Apr 16 20:11:52.898708 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d89c84d3-91b3-4e2f-8457-b44d10b35a08-tmp-dir\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:52.898708 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.898393 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls podName:7f8bccf8-1c1e-4890-98f3-747f76421e6c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.398366593 +0000 UTC m=+34.305712008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d2sbf" (UID: "7f8bccf8-1c1e-4890-98f3-747f76421e6c") : secret "samples-operator-tls" not found Apr 16 20:11:52.898708 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898451 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-trusted-ca\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:52.898708 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898577 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89638695-826b-4c66-a544-96f10200a105-service-ca-bundle\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.898708 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89638695-826b-4c66-a544-96f10200a105-serving-cert\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.898708 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898665 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9293d674-a736-4062-a5ad-cc844313fbfe-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d883e9c-34a0-4bc6-8784-879380b900d3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/89638695-826b-4c66-a544-96f10200a105-snapshots\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/89638695-826b-4c66-a544-96f10200a105-tmp\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " 
pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898953 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k84cg\" (UniqueName: \"kubernetes.io/projected/d89c84d3-91b3-4e2f-8457-b44d10b35a08-kube-api-access-k84cg\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.898979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ngq4\" (UniqueName: \"kubernetes.io/projected/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-kube-api-access-8ngq4\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.899003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-serving-cert\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:52.899199 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:52.899124 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:11:52.899199 ip-10-0-129-41 
kubenswrapper[2571]: E0416 20:11:52.899188 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert podName:b3bab96c-fc58-4afa-886a-ff24380a19e6 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.399172633 +0000 UTC m=+34.306518046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert") pod "ingress-canary-mchp6" (UID: "b3bab96c-fc58-4afa-886a-ff24380a19e6") : secret "canary-serving-cert" not found Apr 16 20:11:52.899761 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.899291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d883e9c-34a0-4bc6-8784-879380b900d3-config\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" Apr 16 20:11:52.899761 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.899326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9293d674-a736-4062-a5ad-cc844313fbfe-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" Apr 16 20:11:52.899761 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.899639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89638695-826b-4c66-a544-96f10200a105-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.900825 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.900806 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9293d674-a736-4062-a5ad-cc844313fbfe-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" Apr 16 20:11:52.901438 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.901416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d883e9c-34a0-4bc6-8784-879380b900d3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" Apr 16 20:11:52.905545 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.905499 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89638695-826b-4c66-a544-96f10200a105-service-ca-bundle\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.908130 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.907953 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvtmg\" (UniqueName: \"kubernetes.io/projected/9293d674-a736-4062-a5ad-cc844313fbfe-kube-api-access-zvtmg\") pod \"kube-storage-version-migrator-operator-6769c5d45-4g5ns\" (UID: \"9293d674-a736-4062-a5ad-cc844313fbfe\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" Apr 16 20:11:52.908130 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.908083 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx2d9\" (UniqueName: 
\"kubernetes.io/projected/2d883e9c-34a0-4bc6-8784-879380b900d3-kube-api-access-kx2d9\") pod \"service-ca-operator-d6fc45fc5-8f7hz\" (UID: \"2d883e9c-34a0-4bc6-8784-879380b900d3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" Apr 16 20:11:52.908397 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.908363 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt259\" (UniqueName: \"kubernetes.io/projected/7f8bccf8-1c1e-4890-98f3-747f76421e6c-kube-api-access-rt259\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" Apr 16 20:11:52.908487 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.908467 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m2l8\" (UniqueName: \"kubernetes.io/projected/89638695-826b-4c66-a544-96f10200a105-kube-api-access-9m2l8\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.908783 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.908764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89638695-826b-4c66-a544-96f10200a105-serving-cert\") pod \"insights-operator-585dfdc468-qlvhv\" (UID: \"89638695-826b-4c66-a544-96f10200a105\") " pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:52.909011 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.908989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgw8h\" (UniqueName: \"kubernetes.io/projected/b3bab96c-fc58-4afa-886a-ff24380a19e6-kube-api-access-hgw8h\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6" Apr 16 20:11:52.947041 
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.947007 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" Apr 16 20:11:52.968891 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:52.968855 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" Apr 16 20:11:53.000545 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.000503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d89c84d3-91b3-4e2f-8457-b44d10b35a08-tmp-dir\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.000727 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.000568 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-trusted-ca\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.000788 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.000742 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.000868 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.000848 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k84cg\" (UniqueName: \"kubernetes.io/projected/d89c84d3-91b3-4e2f-8457-b44d10b35a08-kube-api-access-k84cg\") pod \"dns-default-8cq5p\" (UID: 
\"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.000927 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.000894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ngq4\" (UniqueName: \"kubernetes.io/projected/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-kube-api-access-8ngq4\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.000976 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.000930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-serving-cert\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.001052 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.001035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d89c84d3-91b3-4e2f-8457-b44d10b35a08-config-volume\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.001101 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.001091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-config\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.001215 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.001196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9qqw\" (UniqueName: 
\"kubernetes.io/projected/025cbe57-f51f-4203-a8cc-6b592a9735f7-kube-api-access-k9qqw\") pod \"network-check-source-8894fc9bd-gvgs9\" (UID: \"025cbe57-f51f-4203-a8cc-6b592a9735f7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9" Apr 16 20:11:53.001773 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.001752 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:11:53.001876 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.001829 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls podName:d89c84d3-91b3-4e2f-8457-b44d10b35a08 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:53.501802546 +0000 UTC m=+34.409147960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls") pod "dns-default-8cq5p" (UID: "d89c84d3-91b3-4e2f-8457-b44d10b35a08") : secret "dns-default-metrics-tls" not found Apr 16 20:11:53.002323 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.002299 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-trusted-ca\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.002956 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.002876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d89c84d3-91b3-4e2f-8457-b44d10b35a08-tmp-dir\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.002956 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.002884 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d89c84d3-91b3-4e2f-8457-b44d10b35a08-config-volume\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.003454 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.003434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-config\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.006557 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.006427 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-serving-cert\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.011798 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.011775 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" Apr 16 20:11:53.015271 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.015248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9qqw\" (UniqueName: \"kubernetes.io/projected/025cbe57-f51f-4203-a8cc-6b592a9735f7-kube-api-access-k9qqw\") pod \"network-check-source-8894fc9bd-gvgs9\" (UID: \"025cbe57-f51f-4203-a8cc-6b592a9735f7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9" Apr 16 20:11:53.017220 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.017202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k84cg\" (UniqueName: \"kubernetes.io/projected/d89c84d3-91b3-4e2f-8457-b44d10b35a08-kube-api-access-k84cg\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:11:53.017573 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.017551 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ngq4\" (UniqueName: \"kubernetes.io/projected/0803ae09-3a9f-4d31-988c-4b5c2a29a6a2-kube-api-access-8ngq4\") pod \"console-operator-9d4b6777b-pjp69\" (UID: \"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.052645 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.052558 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9" Apr 16 20:11:53.104830 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.104795 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:11:53.304167 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.304073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:11:53.304167 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.304125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:53.304167 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.304162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.304243 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.304266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304243 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304320 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304362 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304371 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.304346502 +0000 UTC m=+35.211691900 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : configmap references non-existent config key: service-ca.crt
Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304376 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5847757746-fm52t: secret "image-registry-tls" not found
Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304400 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.304389348 +0000 UTC m=+35.211734749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : secret "router-metrics-certs-default" not found
Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304407 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304420 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert podName:05a809af-d1a2-4af3-9ac4-46b14e4aada1 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.304411263 +0000 UTC m=+35.211756666 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h59sf" (UID: "05a809af-d1a2-4af3-9ac4-46b14e4aada1") : secret "networking-console-plugin-cert" not found
Apr 16 20:11:53.304446 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304446 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls podName:dc063ce5-322d-48b4-9d08-1904d3210ecd nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.304430669 +0000 UTC m=+35.211776072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wndvm" (UID: "dc063ce5-322d-48b4-9d08-1904d3210ecd") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:53.304818 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.304464 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls podName:bc1a4b01-eb2b-4766-9d17-3b931cb5aab1 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.30445482 +0000 UTC m=+35.211800221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls") pod "image-registry-5847757746-fm52t" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1") : secret "image-registry-tls" not found
Apr 16 20:11:53.405507 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.405463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:53.405721 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.405627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:53.405721 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.405660 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:11:53.405838 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.405732 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert podName:b3bab96c-fc58-4afa-886a-ff24380a19e6 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.405711971 +0000 UTC m=+35.313057370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert") pod "ingress-canary-mchp6" (UID: "b3bab96c-fc58-4afa-886a-ff24380a19e6") : secret "canary-serving-cert" not found
Apr 16 20:11:53.405838 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.405754 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 20:11:53.405838 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.405793 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls podName:7f8bccf8-1c1e-4890-98f3-747f76421e6c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.405782233 +0000 UTC m=+35.313127630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d2sbf" (UID: "7f8bccf8-1c1e-4890-98f3-747f76421e6c") : secret "samples-operator-tls" not found
Apr 16 20:11:53.507227 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.507185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:11:53.507426 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.507348 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:11:53.507496 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:53.507427 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls podName:d89c84d3-91b3-4e2f-8457-b44d10b35a08 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:54.507405477 +0000 UTC m=+35.414750891 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls") pod "dns-default-8cq5p" (UID: "d89c84d3-91b3-4e2f-8457-b44d10b35a08") : secret "dns-default-metrics-tls" not found
Apr 16 20:11:53.619072 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.619035 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j"
Apr 16 20:11:53.619502 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.619438 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx"
Apr 16 20:11:53.619886 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.619864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:11:53.622414 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.622390 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 20:11:53.623120 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.623096 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 20:11:53.623120 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.623114 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm2hn\""
Apr 16 20:11:53.623501 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:53.623484 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b2xrn\""
Apr 16 20:11:54.315453 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.315411 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"
Apr 16 20:11:54.315453 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.315462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:54.315691 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.315501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:54.315691 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315594 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:11:54.315691 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.315605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:54.315691 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.315625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:54.315691 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315656 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert podName:05a809af-d1a2-4af3-9ac4-46b14e4aada1 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.315639898 +0000 UTC m=+37.222985296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h59sf" (UID: "05a809af-d1a2-4af3-9ac4-46b14e4aada1") : secret "networking-console-plugin-cert" not found
Apr 16 20:11:54.315839 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315725 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:54.315839 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315763 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls podName:dc063ce5-322d-48b4-9d08-1904d3210ecd nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.315751881 +0000 UTC m=+37.223097279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wndvm" (UID: "dc063ce5-322d-48b4-9d08-1904d3210ecd") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:54.315839 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315816 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.315800806 +0000 UTC m=+37.223146204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : configmap references non-existent config key: service-ca.crt
Apr 16 20:11:54.315934 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315854 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:11:54.315934 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.315867247 +0000 UTC m=+37.223212645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : secret "router-metrics-certs-default" not found
Apr 16 20:11:54.315934 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315910 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:11:54.315934 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315917 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5847757746-fm52t: secret "image-registry-tls" not found
Apr 16 20:11:54.315934 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.315936 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls podName:bc1a4b01-eb2b-4766-9d17-3b931cb5aab1 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.315930273 +0000 UTC m=+37.223275671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls") pod "image-registry-5847757746-fm52t" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1") : secret "image-registry-tls" not found
Apr 16 20:11:54.416607 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.416569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:54.416794 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.416693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:54.416794 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.416739 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:11:54.416911 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.416815 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert podName:b3bab96c-fc58-4afa-886a-ff24380a19e6 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.416799694 +0000 UTC m=+37.324145092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert") pod "ingress-canary-mchp6" (UID: "b3bab96c-fc58-4afa-886a-ff24380a19e6") : secret "canary-serving-cert" not found
Apr 16 20:11:54.416911 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.416820 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 20:11:54.416911 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.416872 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls podName:7f8bccf8-1c1e-4890-98f3-747f76421e6c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.416855554 +0000 UTC m=+37.324200955 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d2sbf" (UID: "7f8bccf8-1c1e-4890-98f3-747f76421e6c") : secret "samples-operator-tls" not found
Apr 16 20:11:54.518112 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:54.518081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:11:54.518294 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.518257 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:11:54.518428 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:54.518339 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls podName:d89c84d3-91b3-4e2f-8457-b44d10b35a08 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:56.518324339 +0000 UTC m=+37.425669737 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls") pod "dns-default-8cq5p" (UID: "d89c84d3-91b3-4e2f-8457-b44d10b35a08") : secret "dns-default-metrics-tls" not found
Apr 16 20:11:55.351819 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.351509 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qlvhv"]
Apr 16 20:11:55.356110 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.356052 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz"]
Apr 16 20:11:55.358961 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:55.358932 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89638695_826b_4c66_a544_96f10200a105.slice/crio-9682631877aa79bb1895d860c58131d210a18aca6340cc534c0962e043435c2b WatchSource:0}: Error finding container 9682631877aa79bb1895d860c58131d210a18aca6340cc534c0962e043435c2b: Status 404 returned error can't find the container with id 9682631877aa79bb1895d860c58131d210a18aca6340cc534c0962e043435c2b
Apr 16 20:11:55.362232 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.361458 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pjp69"]
Apr 16 20:11:55.363936 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.363911 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5"]
Apr 16 20:11:55.364214 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:55.364168 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0803ae09_3a9f_4d31_988c_4b5c2a29a6a2.slice/crio-e66640b403ffce282c736b0d4f3c793b0c2810213646a3b33fa9f2a51450a0f5 WatchSource:0}: Error finding container e66640b403ffce282c736b0d4f3c793b0c2810213646a3b33fa9f2a51450a0f5: Status 404 returned error can't find the container with id e66640b403ffce282c736b0d4f3c793b0c2810213646a3b33fa9f2a51450a0f5
Apr 16 20:11:55.367265 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:55.367240 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799e94dc_a712_492e_a369_129299525b15.slice/crio-dbfce23a6d8f373c63b778c642b1e807e9bbd4ba113ec1654901cedbbc12e4be WatchSource:0}: Error finding container dbfce23a6d8f373c63b778c642b1e807e9bbd4ba113ec1654901cedbbc12e4be: Status 404 returned error can't find the container with id dbfce23a6d8f373c63b778c642b1e807e9bbd4ba113ec1654901cedbbc12e4be
Apr 16 20:11:55.373716 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.372928 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9"]
Apr 16 20:11:55.375754 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.375731 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns"]
Apr 16 20:11:55.382328 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:55.382289 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod025cbe57_f51f_4203_a8cc_6b592a9735f7.slice/crio-008ed8223ba0dfeefa2cf474b8c6f1c0ce7edbf705cd70af4e73d257d17ccfa1 WatchSource:0}: Error finding container 008ed8223ba0dfeefa2cf474b8c6f1c0ce7edbf705cd70af4e73d257d17ccfa1: Status 404 returned error can't find the container with id 008ed8223ba0dfeefa2cf474b8c6f1c0ce7edbf705cd70af4e73d257d17ccfa1
Apr 16 20:11:55.383241 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:11:55.383209 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9293d674_a736_4062_a5ad_cc844313fbfe.slice/crio-c80611b10bacea535689b00e43f0efd0998c2334a85122554778ac75e264aa7e WatchSource:0}: Error finding container c80611b10bacea535689b00e43f0efd0998c2334a85122554778ac75e264aa7e: Status 404 returned error can't find the container with id c80611b10bacea535689b00e43f0efd0998c2334a85122554778ac75e264aa7e
Apr 16 20:11:55.879432 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.879351 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c611987-0423-4488-b0f7-408d1c68cda1" containerID="3f99fb1744caa12acef1ba6411d8c6aa284076d4670266e960dcac6a53db9e90" exitCode=0
Apr 16 20:11:55.879615 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.879439 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerDied","Data":"3f99fb1744caa12acef1ba6411d8c6aa284076d4670266e960dcac6a53db9e90"}
Apr 16 20:11:55.880504 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.880469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9" event={"ID":"025cbe57-f51f-4203-a8cc-6b592a9735f7","Type":"ContainerStarted","Data":"008ed8223ba0dfeefa2cf474b8c6f1c0ce7edbf705cd70af4e73d257d17ccfa1"}
Apr 16 20:11:55.881364 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.881342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" event={"ID":"89638695-826b-4c66-a544-96f10200a105","Type":"ContainerStarted","Data":"9682631877aa79bb1895d860c58131d210a18aca6340cc534c0962e043435c2b"}
Apr 16 20:11:55.882285 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.882263 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" event={"ID":"9293d674-a736-4062-a5ad-cc844313fbfe","Type":"ContainerStarted","Data":"c80611b10bacea535689b00e43f0efd0998c2334a85122554778ac75e264aa7e"}
Apr 16 20:11:55.883273 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.883255 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5" event={"ID":"799e94dc-a712-492e-a369-129299525b15","Type":"ContainerStarted","Data":"dbfce23a6d8f373c63b778c642b1e807e9bbd4ba113ec1654901cedbbc12e4be"}
Apr 16 20:11:55.884232 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.884212 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" event={"ID":"2d883e9c-34a0-4bc6-8784-879380b900d3","Type":"ContainerStarted","Data":"03dbddefb6b0769f6d7314b2e2ecf170152d42fc225cd90c4d3cc7b00d5511f9"}
Apr 16 20:11:55.885120 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:55.885100 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" event={"ID":"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2","Type":"ContainerStarted","Data":"e66640b403ffce282c736b0d4f3c793b0c2810213646a3b33fa9f2a51450a0f5"}
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.340379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.340654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.340826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.340868 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.340905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341029 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341101 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls podName:dc063ce5-322d-48b4-9d08-1904d3210ecd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.341079898 +0000 UTC m=+41.248425318 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wndvm" (UID: "dc063ce5-322d-48b4-9d08-1904d3210ecd") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341150 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341169 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341180 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5847757746-fm52t: secret "image-registry-tls" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341191 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert podName:05a809af-d1a2-4af3-9ac4-46b14e4aada1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.341178248 +0000 UTC m=+41.248523650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h59sf" (UID: "05a809af-d1a2-4af3-9ac4-46b14e4aada1") : secret "networking-console-plugin-cert" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341212 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls podName:bc1a4b01-eb2b-4766-9d17-3b931cb5aab1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.341201349 +0000 UTC m=+41.248546748 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls") pod "image-registry-5847757746-fm52t" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1") : secret "image-registry-tls" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341029 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341242 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.341234742 +0000 UTC m=+41.248580146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : secret "router-metrics-certs-default" not found
Apr 16 20:11:56.341288 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.341264 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.341256209 +0000 UTC m=+41.248601607 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : configmap references non-existent config key: service-ca.crt
Apr 16 20:11:56.442294 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.441958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:11:56.442294 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.442094 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:11:56.442294 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.442117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:11:56.442294 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.442155 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert podName:b3bab96c-fc58-4afa-886a-ff24380a19e6 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.442138829 +0000 UTC m=+41.349484228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert") pod "ingress-canary-mchp6" (UID: "b3bab96c-fc58-4afa-886a-ff24380a19e6") : secret "canary-serving-cert" not found
Apr 16 20:11:56.442294 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.442217 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 20:11:56.442294 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.442269 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls podName:7f8bccf8-1c1e-4890-98f3-747f76421e6c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.442253379 +0000 UTC m=+41.349598778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d2sbf" (UID: "7f8bccf8-1c1e-4890-98f3-747f76421e6c") : secret "samples-operator-tls" not found
Apr 16 20:11:56.543413 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.542893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:11:56.543413 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.543068 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:11:56.543413 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:11:56.543116 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls podName:d89c84d3-91b3-4e2f-8457-b44d10b35a08 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.543103855 +0000 UTC m=+41.450449253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls") pod "dns-default-8cq5p" (UID: "d89c84d3-91b3-4e2f-8457-b44d10b35a08") : secret "dns-default-metrics-tls" not found
Apr 16 20:11:56.897046 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.896108 2571 generic.go:358] "Generic (PLEG): container finished" podID="9c611987-0423-4488-b0f7-408d1c68cda1" containerID="3074953d9271a12e3a4f676f51d0cfa85f19cd19b74515117e19a7a8a51393e9" exitCode=0
Apr 16 20:11:56.897046 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:56.896180 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerDied","Data":"3074953d9271a12e3a4f676f51d0cfa85f19cd19b74515117e19a7a8a51393e9"}
Apr 16 20:11:57.901679 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:57.901641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" event={"ID":"9c611987-0423-4488-b0f7-408d1c68cda1","Type":"ContainerStarted","Data":"a40e97127250cc27a390d418b19a4c9d0e5091c866d9dd27d459563e0456e3ce"}
Apr 16 20:11:57.933785 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:57.933730 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fbpzt" podStartSLOduration=4.543701187 podStartE2EDuration="38.933707444s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="2026-04-16 20:11:20.946358581 +0000 UTC m=+1.853703982" lastFinishedPulling="2026-04-16 20:11:55.336364841 +0000 UTC m=+36.243710239" observedRunningTime="2026-04-16 20:11:57.931378373 +0000 UTC m=+38.838723793" watchObservedRunningTime="2026-04-16 20:11:57.933707444 +0000 UTC m=+38.841052866"
Apr 16 20:11:59.777767 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:59.777727 2571 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:59.782507 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:59.782479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/28b55b45-a77c-407d-b55c-8a2538906ceb-original-pull-secret\") pod \"global-pull-secret-syncer-xqx4j\" (UID: \"28b55b45-a77c-407d-b55c-8a2538906ceb\") " pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:11:59.931385 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:11:59.931347 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xqx4j" Apr 16 20:12:00.383805 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.383765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:12:00.383992 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.383813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:00.383992 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.383847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:00.383992 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.383940 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.383995 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384024 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.383996839 +0000 UTC m=+49.291342238 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : configmap references non-existent config key: service-ca.crt Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.383947 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384058 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384070 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert podName:05a809af-d1a2-4af3-9ac4-46b14e4aada1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.384060413 +0000 UTC m=+49.291405811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h59sf" (UID: "05a809af-d1a2-4af3-9ac4-46b14e4aada1") : secret "networking-console-plugin-cert" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384073 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5847757746-fm52t: secret "image-registry-tls" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.384093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384126 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls podName:bc1a4b01-eb2b-4766-9d17-3b931cb5aab1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.384111667 +0000 UTC m=+49.291457088 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls") pod "image-registry-5847757746-fm52t" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1") : secret "image-registry-tls" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384144 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:12:00.384169 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384152 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.384141143 +0000 UTC m=+49.291486545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : secret "router-metrics-certs-default" not found Apr 16 20:12:00.384688 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.384181 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls podName:dc063ce5-322d-48b4-9d08-1904d3210ecd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.384170148 +0000 UTC m=+49.291515548 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wndvm" (UID: "dc063ce5-322d-48b4-9d08-1904d3210ecd") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:12:00.484772 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.484725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" Apr 16 20:12:00.484947 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.484888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6" Apr 16 20:12:00.484947 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.484903 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:12:00.485044 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.484962 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls podName:7f8bccf8-1c1e-4890-98f3-747f76421e6c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.48494726 +0000 UTC m=+49.392292658 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d2sbf" (UID: "7f8bccf8-1c1e-4890-98f3-747f76421e6c") : secret "samples-operator-tls" not found Apr 16 20:12:00.485044 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.485020 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:00.485137 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.485076 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert podName:b3bab96c-fc58-4afa-886a-ff24380a19e6 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:08.48505979 +0000 UTC m=+49.392405191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert") pod "ingress-canary-mchp6" (UID: "b3bab96c-fc58-4afa-886a-ff24380a19e6") : secret "canary-serving-cert" not found Apr 16 20:12:00.586226 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:00.586181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:12:00.586383 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.586335 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:00.586425 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:00.586405 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls podName:d89c84d3-91b3-4e2f-8457-b44d10b35a08 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:08.586389186 +0000 UTC m=+49.493734584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls") pod "dns-default-8cq5p" (UID: "d89c84d3-91b3-4e2f-8457-b44d10b35a08") : secret "dns-default-metrics-tls" not found Apr 16 20:12:02.380004 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.379982 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xqx4j"] Apr 16 20:12:02.384939 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:02.384895 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28b55b45_a77c_407d_b55c_8a2538906ceb.slice/crio-ebb1f9bd6bb4be90ca7b782e4333264e9268f4878163617c407bc145a1c5afb8 WatchSource:0}: Error finding container ebb1f9bd6bb4be90ca7b782e4333264e9268f4878163617c407bc145a1c5afb8: Status 404 returned error can't find the container with id ebb1f9bd6bb4be90ca7b782e4333264e9268f4878163617c407bc145a1c5afb8 Apr 16 20:12:02.914606 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.914555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" event={"ID":"2d883e9c-34a0-4bc6-8784-879380b900d3","Type":"ContainerStarted","Data":"34f2d7dfe4272dab333dcc440b007e538772468c17a0f62e2e9662ff96f83527"} Apr 16 20:12:02.917209 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.917034 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/0.log" Apr 16 20:12:02.917209 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.917070 2571 generic.go:358] "Generic (PLEG): container finished" podID="0803ae09-3a9f-4d31-988c-4b5c2a29a6a2" containerID="82a22d0ce3fe09e346b4c3964d3543ac29f825a85d4a8815b06b063f8579c71b" 
exitCode=255 Apr 16 20:12:02.917209 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.917162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" event={"ID":"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2","Type":"ContainerDied","Data":"82a22d0ce3fe09e346b4c3964d3543ac29f825a85d4a8815b06b063f8579c71b"} Apr 16 20:12:02.917469 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.917451 2571 scope.go:117] "RemoveContainer" containerID="82a22d0ce3fe09e346b4c3964d3543ac29f825a85d4a8815b06b063f8579c71b" Apr 16 20:12:02.920302 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.919826 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9" event={"ID":"025cbe57-f51f-4203-a8cc-6b592a9735f7","Type":"ContainerStarted","Data":"b465319e8d071287bc9a471fd14920ccede893ea085f0b2a7cbb04e91b431216"} Apr 16 20:12:02.922386 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.921916 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" event={"ID":"89638695-826b-4c66-a544-96f10200a105","Type":"ContainerStarted","Data":"f0a0a5b71f77a2226e5db348bb5dd029f75af991ae976b9a9c7d32f3cffd8062"} Apr 16 20:12:02.925477 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.925232 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" event={"ID":"9293d674-a736-4062-a5ad-cc844313fbfe","Type":"ContainerStarted","Data":"075983a23efeed2cf1e6e6db8ad68d220ed4997c8c6130dd8ec96b979100eb21"} Apr 16 20:12:02.928510 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.928488 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5" 
event={"ID":"799e94dc-a712-492e-a369-129299525b15","Type":"ContainerStarted","Data":"597e8c329aab4b584e63088557ca6a64e8bf8eeb0ec7f657f6ec1bbcf2e48c94"} Apr 16 20:12:02.930482 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.930415 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xqx4j" event={"ID":"28b55b45-a77c-407d-b55c-8a2538906ceb","Type":"ContainerStarted","Data":"ebb1f9bd6bb4be90ca7b782e4333264e9268f4878163617c407bc145a1c5afb8"} Apr 16 20:12:02.933467 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.932768 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" podStartSLOduration=31.047161904 podStartE2EDuration="37.932754981s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.365595041 +0000 UTC m=+36.272940438" lastFinishedPulling="2026-04-16 20:12:02.251188104 +0000 UTC m=+43.158533515" observedRunningTime="2026-04-16 20:12:02.931264883 +0000 UTC m=+43.838610304" watchObservedRunningTime="2026-04-16 20:12:02.932754981 +0000 UTC m=+43.840100402" Apr 16 20:12:02.974642 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:02.974081 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" podStartSLOduration=31.108141257 podStartE2EDuration="37.974059507s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.385258232 +0000 UTC m=+36.292603636" lastFinishedPulling="2026-04-16 20:12:02.251176481 +0000 UTC m=+43.158521886" observedRunningTime="2026-04-16 20:12:02.97366939 +0000 UTC m=+43.881014811" watchObservedRunningTime="2026-04-16 20:12:02.974059507 +0000 UTC m=+43.881404929" Apr 16 20:12:03.013368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.013261 2571 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-p4xv5" podStartSLOduration=31.131341497 podStartE2EDuration="38.01324677s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.369322514 +0000 UTC m=+36.276667916" lastFinishedPulling="2026-04-16 20:12:02.251227787 +0000 UTC m=+43.158573189" observedRunningTime="2026-04-16 20:12:02.994264905 +0000 UTC m=+43.901610326" watchObservedRunningTime="2026-04-16 20:12:03.01324677 +0000 UTC m=+43.920592189" Apr 16 20:12:03.013368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.013334 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gvgs9" podStartSLOduration=31.13588778 podStartE2EDuration="38.013330325s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.384724961 +0000 UTC m=+36.292070365" lastFinishedPulling="2026-04-16 20:12:02.262167513 +0000 UTC m=+43.169512910" observedRunningTime="2026-04-16 20:12:03.012595843 +0000 UTC m=+43.919941263" watchObservedRunningTime="2026-04-16 20:12:03.013330325 +0000 UTC m=+43.920675785" Apr 16 20:12:03.052175 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.052115 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" podStartSLOduration=31.164774129 podStartE2EDuration="38.052094996s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.363854703 +0000 UTC m=+36.271200102" lastFinishedPulling="2026-04-16 20:12:02.251175568 +0000 UTC m=+43.158520969" observedRunningTime="2026-04-16 20:12:03.049900592 +0000 UTC m=+43.957246013" watchObservedRunningTime="2026-04-16 20:12:03.052094996 +0000 UTC m=+43.959440415" Apr 16 20:12:03.106184 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.106096 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:12:03.106184 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.106151 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:12:03.935822 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.935684 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:12:03.937543 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.937514 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/0.log" Apr 16 20:12:03.937829 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.937792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" event={"ID":"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2","Type":"ContainerDied","Data":"0f96bf0e1ab2b67cb8852d1c038e06ce9176ae7bfaf5ef19d7d74502ebdc8f31"} Apr 16 20:12:03.937948 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.937873 2571 scope.go:117] "RemoveContainer" containerID="82a22d0ce3fe09e346b4c3964d3543ac29f825a85d4a8815b06b063f8579c71b" Apr 16 20:12:03.938163 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.938137 2571 scope.go:117] "RemoveContainer" containerID="0f96bf0e1ab2b67cb8852d1c038e06ce9176ae7bfaf5ef19d7d74502ebdc8f31" Apr 16 20:12:03.938370 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:03.938351 2571 generic.go:358] "Generic (PLEG): container finished" podID="0803ae09-3a9f-4d31-988c-4b5c2a29a6a2" containerID="0f96bf0e1ab2b67cb8852d1c038e06ce9176ae7bfaf5ef19d7d74502ebdc8f31" exitCode=255 Apr 16 20:12:03.938370 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:03.938360 2571 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjp69_openshift-console-operator(0803ae09-3a9f-4d31-988c-4b5c2a29a6a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" podUID="0803ae09-3a9f-4d31-988c-4b5c2a29a6a2" Apr 16 20:12:04.915937 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.915898 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4"] Apr 16 20:12:04.943067 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.942985 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:12:04.954726 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.954698 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4"] Apr 16 20:12:04.954908 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.954747 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" Apr 16 20:12:04.955047 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.955032 2571 scope.go:117] "RemoveContainer" containerID="0f96bf0e1ab2b67cb8852d1c038e06ce9176ae7bfaf5ef19d7d74502ebdc8f31" Apr 16 20:12:04.955243 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:04.955221 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjp69_openshift-console-operator(0803ae09-3a9f-4d31-988c-4b5c2a29a6a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" podUID="0803ae09-3a9f-4d31-988c-4b5c2a29a6a2" Apr 16 20:12:04.957999 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.957922 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 20:12:04.958354 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.958333 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-4plxl\"" Apr 16 20:12:04.959146 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:04.959125 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 20:12:05.036955 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:05.036916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxwkc\" (UniqueName: \"kubernetes.io/projected/93f97942-290d-4f48-a08a-5866a654cc19-kube-api-access-rxwkc\") pod \"migrator-74bb7799d9-5dbt4\" (UID: \"93f97942-290d-4f48-a08a-5866a654cc19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" Apr 16 20:12:05.137880 
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:05.137844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxwkc\" (UniqueName: \"kubernetes.io/projected/93f97942-290d-4f48-a08a-5866a654cc19-kube-api-access-rxwkc\") pod \"migrator-74bb7799d9-5dbt4\" (UID: \"93f97942-290d-4f48-a08a-5866a654cc19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" Apr 16 20:12:05.146641 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:05.146614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxwkc\" (UniqueName: \"kubernetes.io/projected/93f97942-290d-4f48-a08a-5866a654cc19-kube-api-access-rxwkc\") pod \"migrator-74bb7799d9-5dbt4\" (UID: \"93f97942-290d-4f48-a08a-5866a654cc19\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" Apr 16 20:12:05.280669 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:05.280568 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" Apr 16 20:12:05.556403 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:05.556323 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-24qzw_916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e/dns-node-resolver/0.log" Apr 16 20:12:06.299933 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.299898 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gn5vs"] Apr 16 20:12:06.330403 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.330372 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gn5vs"] Apr 16 20:12:06.330596 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.330524 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.333513 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.333480 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-xnvdg\"" Apr 16 20:12:06.333649 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.333488 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 20:12:06.334432 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.334413 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 20:12:06.334432 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.334425 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 20:12:06.334674 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.334659 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 20:12:06.450270 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.450232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31fafad0-4daa-4ffd-b118-36b7ac7fc717-signing-key\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.450426 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.450273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg45t\" (UniqueName: \"kubernetes.io/projected/31fafad0-4daa-4ffd-b118-36b7ac7fc717-kube-api-access-pg45t\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " 
pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.450426 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.450304 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31fafad0-4daa-4ffd-b118-36b7ac7fc717-signing-cabundle\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.551453 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.551424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31fafad0-4daa-4ffd-b118-36b7ac7fc717-signing-key\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.551591 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.551468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg45t\" (UniqueName: \"kubernetes.io/projected/31fafad0-4daa-4ffd-b118-36b7ac7fc717-kube-api-access-pg45t\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.551591 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.551487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/31fafad0-4daa-4ffd-b118-36b7ac7fc717-signing-cabundle\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.553956 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.553930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/31fafad0-4daa-4ffd-b118-36b7ac7fc717-signing-cabundle\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.555757 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.555722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/31fafad0-4daa-4ffd-b118-36b7ac7fc717-signing-key\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.562110 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.562087 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg45t\" (UniqueName: \"kubernetes.io/projected/31fafad0-4daa-4ffd-b118-36b7ac7fc717-kube-api-access-pg45t\") pod \"service-ca-865cb79987-gn5vs\" (UID: \"31fafad0-4daa-4ffd-b118-36b7ac7fc717\") " pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.642404 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.642364 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gn5vs" Apr 16 20:12:06.651828 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.651804 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4"] Apr 16 20:12:06.654304 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:06.654275 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f97942_290d_4f48_a08a_5866a654cc19.slice/crio-6068928ab2605be7a7dda001593b751ab2a6e14f7f33b253922e003c91863c51 WatchSource:0}: Error finding container 6068928ab2605be7a7dda001593b751ab2a6e14f7f33b253922e003c91863c51: Status 404 returned error can't find the container with id 6068928ab2605be7a7dda001593b751ab2a6e14f7f33b253922e003c91863c51 Apr 16 20:12:06.761632 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.761600 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gn5vs"] Apr 16 20:12:06.899516 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:06.899474 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31fafad0_4daa_4ffd_b118_36b7ac7fc717.slice/crio-c804c35a11777231a388864412047af875e51826f9812c4423371b6ebde728c5 WatchSource:0}: Error finding container c804c35a11777231a388864412047af875e51826f9812c4423371b6ebde728c5: Status 404 returned error can't find the container with id c804c35a11777231a388864412047af875e51826f9812c4423371b6ebde728c5 Apr 16 20:12:06.949748 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.949720 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gn5vs" event={"ID":"31fafad0-4daa-4ffd-b118-36b7ac7fc717","Type":"ContainerStarted","Data":"c804c35a11777231a388864412047af875e51826f9812c4423371b6ebde728c5"} Apr 16 20:12:06.950754 ip-10-0-129-41 kubenswrapper[2571]: 
I0416 20:12:06.950729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" event={"ID":"93f97942-290d-4f48-a08a-5866a654cc19","Type":"ContainerStarted","Data":"6068928ab2605be7a7dda001593b751ab2a6e14f7f33b253922e003c91863c51"} Apr 16 20:12:06.964129 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:06.964109 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xz7cv_7aa3c845-972e-41e1-89d2-9126f2eb4905/node-ca/0.log" Apr 16 20:12:07.955698 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:07.955650 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xqx4j" event={"ID":"28b55b45-a77c-407d-b55c-8a2538906ceb","Type":"ContainerStarted","Data":"bdf4f322e52fe160775b1498c22e0354970fda163cc52bb6de2d8d750886612b"} Apr 16 20:12:07.957460 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:07.957430 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gn5vs" event={"ID":"31fafad0-4daa-4ffd-b118-36b7ac7fc717","Type":"ContainerStarted","Data":"e2ef98d21c5330ba5e81ba7356d2631be4bc86e0d10ceae1f7a08898556c0fcf"} Apr 16 20:12:07.974455 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:07.974408 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xqx4j" podStartSLOduration=20.424208698 podStartE2EDuration="24.974391563s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:12:02.387026906 +0000 UTC m=+43.294372305" lastFinishedPulling="2026-04-16 20:12:06.937209769 +0000 UTC m=+47.844555170" observedRunningTime="2026-04-16 20:12:07.973428213 +0000 UTC m=+48.880773634" watchObservedRunningTime="2026-04-16 20:12:07.974391563 +0000 UTC m=+48.881736983" Apr 16 20:12:07.992617 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:07.992415 2571 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-service-ca/service-ca-865cb79987-gn5vs" podStartSLOduration=1.9923953509999999 podStartE2EDuration="1.992395351s" podCreationTimestamp="2026-04-16 20:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:07.991376907 +0000 UTC m=+48.898722329" watchObservedRunningTime="2026-04-16 20:12:07.992395351 +0000 UTC m=+48.899740772" Apr 16 20:12:08.470375 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.470340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:12:08.470375 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.470381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.470404 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470489 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found 
Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470499 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470540 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.470506468 +0000 UTC m=+65.377851880 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : configmap references non-existent config key: service-ca.crt Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470580 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert podName:05a809af-d1a2-4af3-9ac4-46b14e4aada1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.470562873 +0000 UTC m=+65.377908275 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-h59sf" (UID: "05a809af-d1a2-4af3-9ac4-46b14e4aada1") : secret "networking-console-plugin-cert" not found Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.470643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:12:08.470663 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470662 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs podName:6f259876-3b65-4751-ba98-118d8aa205f9 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.470644559 +0000 UTC m=+65.377989959 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs") pod "router-default-984b845fd-2gf9m" (UID: "6f259876-3b65-4751-ba98-118d8aa205f9") : secret "router-metrics-certs-default" not found Apr 16 20:12:08.470935 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.470701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:12:08.470935 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470711 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:08.470935 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470725 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5847757746-fm52t: secret "image-registry-tls" not found Apr 16 20:12:08.470935 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470756 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:12:08.470935 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470771 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls podName:bc1a4b01-eb2b-4766-9d17-3b931cb5aab1 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.470757314 +0000 UTC m=+65.378102713 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls") pod "image-registry-5847757746-fm52t" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1") : secret "image-registry-tls" not found Apr 16 20:12:08.470935 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.470785 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls podName:dc063ce5-322d-48b4-9d08-1904d3210ecd nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.470777059 +0000 UTC m=+65.378122457 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wndvm" (UID: "dc063ce5-322d-48b4-9d08-1904d3210ecd") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:12:08.571665 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.571638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" Apr 16 20:12:08.571816 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.571794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6" Apr 16 20:12:08.572076 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.571990 2571 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:08.572076 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.572026 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 20:12:08.572076 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.572060 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert podName:b3bab96c-fc58-4afa-886a-ff24380a19e6 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.572041623 +0000 UTC m=+65.479387026 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert") pod "ingress-canary-mchp6" (UID: "b3bab96c-fc58-4afa-886a-ff24380a19e6") : secret "canary-serving-cert" not found Apr 16 20:12:08.572076 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.572077 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls podName:7f8bccf8-1c1e-4890-98f3-747f76421e6c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.57206888 +0000 UTC m=+65.479414278 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-d2sbf" (UID: "7f8bccf8-1c1e-4890-98f3-747f76421e6c") : secret "samples-operator-tls" not found Apr 16 20:12:08.673302 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.673266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p" Apr 16 20:12:08.673506 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.673489 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:08.673594 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:08.673568 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls podName:d89c84d3-91b3-4e2f-8457-b44d10b35a08 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:24.673548559 +0000 UTC m=+65.580893970 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls") pod "dns-default-8cq5p" (UID: "d89c84d3-91b3-4e2f-8457-b44d10b35a08") : secret "dns-default-metrics-tls" not found Apr 16 20:12:08.963277 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.963236 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" event={"ID":"93f97942-290d-4f48-a08a-5866a654cc19","Type":"ContainerStarted","Data":"bfbe9be0a7f08bb6cef1595358677714bbab515b805d8a9b3cdf22eed7abc7e6"} Apr 16 20:12:08.963277 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.963279 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" event={"ID":"93f97942-290d-4f48-a08a-5866a654cc19","Type":"ContainerStarted","Data":"401bee783360b1edac4c147b41636b3054996be3569ceba470825f9a825701db"} Apr 16 20:12:08.980027 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:08.979972 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-5dbt4" podStartSLOduration=3.101891325 podStartE2EDuration="4.979957252s" podCreationTimestamp="2026-04-16 20:12:04 +0000 UTC" firstStartedPulling="2026-04-16 20:12:06.656363108 +0000 UTC m=+47.563708507" lastFinishedPulling="2026-04-16 20:12:08.53442902 +0000 UTC m=+49.441774434" observedRunningTime="2026-04-16 20:12:08.979278422 +0000 UTC m=+49.886623843" watchObservedRunningTime="2026-04-16 20:12:08.979957252 +0000 UTC m=+49.887302673" Apr 16 20:12:13.105820 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:13.105781 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:12:13.105820 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:13.105819 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:12:13.106244 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:13.106200 2571 scope.go:117] "RemoveContainer" containerID="0f96bf0e1ab2b67cb8852d1c038e06ce9176ae7bfaf5ef19d7d74502ebdc8f31" Apr 16 20:12:13.106380 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:13.106362 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pjp69_openshift-console-operator(0803ae09-3a9f-4d31-988c-4b5c2a29a6a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" podUID="0803ae09-3a9f-4d31-988c-4b5c2a29a6a2" Apr 16 20:12:18.873969 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:18.873942 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-blr8v" Apr 16 20:12:24.316875 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.316837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:12:24.319654 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.319634 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:24.330027 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.330008 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee593b7f-fc54-40a3-af7d-f5643196a107-metrics-certs\") pod \"network-metrics-daemon-h5ntx\" (UID: \"ee593b7f-fc54-40a3-af7d-f5643196a107\") " 
pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:12:24.417988 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.417950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:12:24.420681 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.420649 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4xg7\" (UniqueName: \"kubernetes.io/projected/ade2626d-4dc5-4796-9c91-0c0699095807-kube-api-access-s4xg7\") pod \"network-check-target-zdf2s\" (UID: \"ade2626d-4dc5-4796-9c91-0c0699095807\") " pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:12:24.518556 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.518487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:12:24.518556 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.518551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:12:24.518807 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.518622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:12:24.518807 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.518654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:24.518807 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.518688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:24.519375 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.519347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f259876-3b65-4751-ba98-118d8aa205f9-service-ca-bundle\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:24.521101 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.521081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"image-registry-5847757746-fm52t\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 
20:12:24.521101 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.521092 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f259876-3b65-4751-ba98-118d8aa205f9-metrics-certs\") pod \"router-default-984b845fd-2gf9m\" (UID: \"6f259876-3b65-4751-ba98-118d8aa205f9\") " pod="openshift-ingress/router-default-984b845fd-2gf9m" Apr 16 20:12:24.521477 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.521458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc063ce5-322d-48b4-9d08-1904d3210ecd-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wndvm\" (UID: \"dc063ce5-322d-48b4-9d08-1904d3210ecd\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:12:24.529502 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.529479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/05a809af-d1a2-4af3-9ac4-46b14e4aada1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-h59sf\" (UID: \"05a809af-d1a2-4af3-9ac4-46b14e4aada1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" Apr 16 20:12:24.540995 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.540973 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-b2xrn\"" Apr 16 20:12:24.545989 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.545970 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm2hn\"" Apr 16 20:12:24.548689 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.548674 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5ntx" Apr 16 20:12:24.554387 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.554365 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:12:24.621396 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.620385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6" Apr 16 20:12:24.621396 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.620470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" Apr 16 20:12:24.624226 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.624161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3bab96c-fc58-4afa-886a-ff24380a19e6-cert\") pod \"ingress-canary-mchp6\" (UID: \"b3bab96c-fc58-4afa-886a-ff24380a19e6\") " pod="openshift-ingress-canary/ingress-canary-mchp6" Apr 16 20:12:24.624464 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.624439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7f8bccf8-1c1e-4890-98f3-747f76421e6c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-d2sbf\" (UID: \"7f8bccf8-1c1e-4890-98f3-747f76421e6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" Apr 16 
20:12:24.661636 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.661514 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-g68ln\"" Apr 16 20:12:24.669700 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.669664 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" Apr 16 20:12:24.678020 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.677984 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ksxbp\"" Apr 16 20:12:24.681720 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.681689 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h5ntx"] Apr 16 20:12:24.684719 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:24.684691 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee593b7f_fc54_40a3_af7d_f5643196a107.slice/crio-1e55173081a4e10bf4c515a843fac10fdefa48af6050ac04783ab9d4afd6126d WatchSource:0}: Error finding container 1e55173081a4e10bf4c515a843fac10fdefa48af6050ac04783ab9d4afd6126d: Status 404 returned error can't find the container with id 1e55173081a4e10bf4c515a843fac10fdefa48af6050ac04783ab9d4afd6126d Apr 16 20:12:24.685620 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.685600 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:12:24.697393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.697365 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zdf2s"]
Apr 16 20:12:24.700590 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:24.700515 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade2626d_4dc5_4796_9c91_0c0699095807.slice/crio-229135c69aa9282dbb2384ca031cc30d3b684837f5aa8fd5d1bc91c9d0400f14 WatchSource:0}: Error finding container 229135c69aa9282dbb2384ca031cc30d3b684837f5aa8fd5d1bc91c9d0400f14: Status 404 returned error can't find the container with id 229135c69aa9282dbb2384ca031cc30d3b684837f5aa8fd5d1bc91c9d0400f14
Apr 16 20:12:24.711797 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.711772 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6s4hm\""
Apr 16 20:12:24.720489 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.720425 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"
Apr 16 20:12:24.721715 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.721175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:12:24.731940 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.731692 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2r8gt\""
Apr 16 20:12:24.738885 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.738845 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:12:24.739164 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.738921 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d89c84d3-91b3-4e2f-8457-b44d10b35a08-metrics-tls\") pod \"dns-default-8cq5p\" (UID: \"d89c84d3-91b3-4e2f-8457-b44d10b35a08\") " pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:12:24.796257 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.795994 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fkq6l\""
Apr 16 20:12:24.804996 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.800602 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"
Apr 16 20:12:24.824160 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.823994 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm"]
Apr 16 20:12:24.829001 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:24.828963 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc063ce5_322d_48b4_9d08_1904d3210ecd.slice/crio-adc218499dbd2edf6e144f3e0b86d6dad093082e1f341727c2338fd52755e08b WatchSource:0}: Error finding container adc218499dbd2edf6e144f3e0b86d6dad093082e1f341727c2338fd52755e08b: Status 404 returned error can't find the container with id adc218499dbd2edf6e144f3e0b86d6dad093082e1f341727c2338fd52755e08b
Apr 16 20:12:24.839970 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.836992 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v7v7t\""
Apr 16 20:12:24.845690 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.845344 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mchp6"
Apr 16 20:12:24.857935 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.857815 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5847757746-fm52t"]
Apr 16 20:12:24.865575 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:24.865480 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1a4b01_eb2b_4766_9d17_3b931cb5aab1.slice/crio-532f909bb8d3d29374e6bbfd39980ec0459c1a97c709b892af2a86812e683329 WatchSource:0}: Error finding container 532f909bb8d3d29374e6bbfd39980ec0459c1a97c709b892af2a86812e683329: Status 404 returned error can't find the container with id 532f909bb8d3d29374e6bbfd39980ec0459c1a97c709b892af2a86812e683329
Apr 16 20:12:24.878336 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.878129 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tnjdz\""
Apr 16 20:12:24.886022 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.885933 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8cq5p"
Apr 16 20:12:24.901804 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.901703 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-h59sf"]
Apr 16 20:12:24.909335 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:24.909302 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05a809af_d1a2_4af3_9ac4_46b14e4aada1.slice/crio-1f8062236b2466857fdd611b9217f53c1f4e96fb5acff9788502b288d267b904 WatchSource:0}: Error finding container 1f8062236b2466857fdd611b9217f53c1f4e96fb5acff9788502b288d267b904: Status 404 returned error can't find the container with id 1f8062236b2466857fdd611b9217f53c1f4e96fb5acff9788502b288d267b904
Apr 16 20:12:24.944274 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:24.943732 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-984b845fd-2gf9m"]
Apr 16 20:12:24.955552 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:24.955443 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f259876_3b65_4751_ba98_118d8aa205f9.slice/crio-875e2e9c70a778aa704bf24a409012b8cf2a3e107946ccdec044329cdc579cc8 WatchSource:0}: Error finding container 875e2e9c70a778aa704bf24a409012b8cf2a3e107946ccdec044329cdc579cc8: Status 404 returned error can't find the container with id 875e2e9c70a778aa704bf24a409012b8cf2a3e107946ccdec044329cdc579cc8
Apr 16 20:12:25.005430 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.005372 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" event={"ID":"05a809af-d1a2-4af3-9ac4-46b14e4aada1","Type":"ContainerStarted","Data":"1f8062236b2466857fdd611b9217f53c1f4e96fb5acff9788502b288d267b904"}
Apr 16 20:12:25.006822 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.006789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5847757746-fm52t" event={"ID":"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1","Type":"ContainerStarted","Data":"532f909bb8d3d29374e6bbfd39980ec0459c1a97c709b892af2a86812e683329"}
Apr 16 20:12:25.008387 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.008189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zdf2s" event={"ID":"ade2626d-4dc5-4796-9c91-0c0699095807","Type":"ContainerStarted","Data":"36cab714658c2ba5211abb876f196903c1d38064fc08e3bef95118e7cb2d9aa7"}
Apr 16 20:12:25.008387 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.008215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zdf2s" event={"ID":"ade2626d-4dc5-4796-9c91-0c0699095807","Type":"ContainerStarted","Data":"229135c69aa9282dbb2384ca031cc30d3b684837f5aa8fd5d1bc91c9d0400f14"}
Apr 16 20:12:25.008644 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.008606 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zdf2s"
Apr 16 20:12:25.009817 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.009791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5ntx" event={"ID":"ee593b7f-fc54-40a3-af7d-f5643196a107","Type":"ContainerStarted","Data":"1e55173081a4e10bf4c515a843fac10fdefa48af6050ac04783ab9d4afd6126d"}
Apr 16 20:12:25.010806 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.010769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-984b845fd-2gf9m" event={"ID":"6f259876-3b65-4751-ba98-118d8aa205f9","Type":"ContainerStarted","Data":"875e2e9c70a778aa704bf24a409012b8cf2a3e107946ccdec044329cdc579cc8"}
Apr 16 20:12:25.015623 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.015597 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf"]
Apr 16 20:12:25.017036 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.016675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" event={"ID":"dc063ce5-322d-48b4-9d08-1904d3210ecd","Type":"ContainerStarted","Data":"adc218499dbd2edf6e144f3e0b86d6dad093082e1f341727c2338fd52755e08b"}
Apr 16 20:12:25.049417 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.049364 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zdf2s" podStartSLOduration=66.04934371 podStartE2EDuration="1m6.04934371s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:25.02981441 +0000 UTC m=+65.937159833" watchObservedRunningTime="2026-04-16 20:12:25.04934371 +0000 UTC m=+65.956689131"
Apr 16 20:12:25.049687 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.049669 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mchp6"]
Apr 16 20:12:25.074610 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:25.074506 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8cq5p"]
Apr 16 20:12:25.077543 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:25.077498 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd89c84d3_91b3_4e2f_8457_b44d10b35a08.slice/crio-cc033f0079b560c0893ba01600e5f7fba3712780556d1d20a81101413382bc00 WatchSource:0}: Error finding container cc033f0079b560c0893ba01600e5f7fba3712780556d1d20a81101413382bc00: Status 404 returned error can't find the container with id cc033f0079b560c0893ba01600e5f7fba3712780556d1d20a81101413382bc00
Apr 16 20:12:26.026419 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.025546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5847757746-fm52t" event={"ID":"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1","Type":"ContainerStarted","Data":"0d2885ec3c4c868f6fe277cde1a89c76bf360db5f0850e4a526d5d62fa113fe9"}
Apr 16 20:12:26.026419 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.026372 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5847757746-fm52t"
Apr 16 20:12:26.028513 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.028397 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mchp6" event={"ID":"b3bab96c-fc58-4afa-886a-ff24380a19e6","Type":"ContainerStarted","Data":"4e68b736d6fed9faf7f26cce5c774834d45f5b149e6d387d324a2fd50057eae3"}
Apr 16 20:12:26.031390 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.030517 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8cq5p" event={"ID":"d89c84d3-91b3-4e2f-8457-b44d10b35a08","Type":"ContainerStarted","Data":"cc033f0079b560c0893ba01600e5f7fba3712780556d1d20a81101413382bc00"}
Apr 16 20:12:26.033865 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.033828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-984b845fd-2gf9m" event={"ID":"6f259876-3b65-4751-ba98-118d8aa205f9","Type":"ContainerStarted","Data":"3964511afd07d1c914afac190a19c905b0e2b3fdb1a4daa5fe9e5038d46863e4"}
Apr 16 20:12:26.036986 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.036854 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" event={"ID":"7f8bccf8-1c1e-4890-98f3-747f76421e6c","Type":"ContainerStarted","Data":"450d95a33d4ecad99839741bcce3dc62596a70fd1d3302b7ea489a6dd213b57d"}
Apr 16 20:12:26.048753 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.047399 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5847757746-fm52t" podStartSLOduration=66.047381211 podStartE2EDuration="1m6.047381211s" podCreationTimestamp="2026-04-16 20:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:26.046270831 +0000 UTC m=+66.953616252" watchObservedRunningTime="2026-04-16 20:12:26.047381211 +0000 UTC m=+66.954726612"
Apr 16 20:12:26.070557 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.070250 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-984b845fd-2gf9m" podStartSLOduration=61.070228278 podStartE2EDuration="1m1.070228278s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:26.068786135 +0000 UTC m=+66.976131578" watchObservedRunningTime="2026-04-16 20:12:26.070228278 +0000 UTC m=+66.977573699"
Apr 16 20:12:26.739544 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.739481 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:12:26.742247 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:26.742223 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:12:27.040615 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.040237 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:12:27.041743 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.041671 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-984b845fd-2gf9m"
Apr 16 20:12:27.138731 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.137218 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"]
Apr 16 20:12:27.141418 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.141393 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"]
Apr 16 20:12:27.141596 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.141580 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.144912 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.144428 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"]
Apr 16 20:12:27.144912 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.144641 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 20:12:27.145109 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145084 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 20:12:27.145286 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145197 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.145353 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145328 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 20:12:27.145844 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145516 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 20:12:27.145844 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145696 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 20:12:27.145844 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145736 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 20:12:27.145844 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.145798 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 20:12:27.147868 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.147846 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 20:12:27.148250 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.148231 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.150558 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.150516 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 20:12:27.150666 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.150567 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mbfff\""
Apr 16 20:12:27.153400 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.153378 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"]
Apr 16 20:12:27.155965 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.155949 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"]
Apr 16 20:12:27.157099 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.157075 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"]
Apr 16 20:12:27.250624 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0bbd0670-7dbf-4928-a161-363090b3dc2f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-94d664f4f-b89hv\" (UID: \"0bbd0670-7dbf-4928-a161-363090b3dc2f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.250802 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-klusterlet-config\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.250802 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250676 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.250802 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-ca\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.250802 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250733 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.250802 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250770 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg88q\" (UniqueName: \"kubernetes.io/projected/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-kube-api-access-qg88q\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.251038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9prp8\" (UniqueName: \"kubernetes.io/projected/0bbd0670-7dbf-4928-a161-363090b3dc2f-kube-api-access-9prp8\") pod \"managed-serviceaccount-addon-agent-94d664f4f-b89hv\" (UID: \"0bbd0670-7dbf-4928-a161-363090b3dc2f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.251038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-hub\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.251038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f5a66d9e-b3ca-40dd-a57f-474a431be482-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.251038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.250984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-tmp\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.251177 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.251043 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kzpx\" (UniqueName: \"kubernetes.io/projected/f5a66d9e-b3ca-40dd-a57f-474a431be482-kube-api-access-7kzpx\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.291891 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.291810 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nbhmg"]
Apr 16 20:12:27.295606 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.295583 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.298849 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.298816 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:12:27.298967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.298957 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bmglr\""
Apr 16 20:12:27.299499 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.299477 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:12:27.323997 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.323968 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nbhmg"]
Apr 16 20:12:27.352483 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9prp8\" (UniqueName: \"kubernetes.io/projected/0bbd0670-7dbf-4928-a161-363090b3dc2f-kube-api-access-9prp8\") pod \"managed-serviceaccount-addon-agent-94d664f4f-b89hv\" (UID: \"0bbd0670-7dbf-4928-a161-363090b3dc2f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.352483 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-hub\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f5a66d9e-b3ca-40dd-a57f-474a431be482-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-tmp\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kzpx\" (UniqueName: \"kubernetes.io/projected/f5a66d9e-b3ca-40dd-a57f-474a431be482-kube-api-access-7kzpx\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352623 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0bbd0670-7dbf-4928-a161-363090b3dc2f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-94d664f4f-b89hv\" (UID: \"0bbd0670-7dbf-4928-a161-363090b3dc2f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352650 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-klusterlet-config\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-ca\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.352769 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.354561 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.352791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qg88q\" (UniqueName: \"kubernetes.io/projected/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-kube-api-access-qg88q\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.354561 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.353171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-tmp\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.354561 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.353446 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f5a66d9e-b3ca-40dd-a57f-474a431be482-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.355840 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.355819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-hub\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.356701 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.356676 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-ca\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.356792 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.356710 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.356792 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.356756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-klusterlet-config\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.356859 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.356818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f5a66d9e-b3ca-40dd-a57f-474a431be482-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.356944 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.356924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0bbd0670-7dbf-4928-a161-363090b3dc2f-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-94d664f4f-b89hv\" (UID: \"0bbd0670-7dbf-4928-a161-363090b3dc2f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.387747 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.387719 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg88q\" (UniqueName: \"kubernetes.io/projected/7a7ec274-8a20-488b-b3be-ff44a04a1cbd-kube-api-access-qg88q\") pod \"klusterlet-addon-workmgr-695457864-sstkk\" (UID: \"7a7ec274-8a20-488b-b3be-ff44a04a1cbd\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.387882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.387782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9prp8\" (UniqueName: \"kubernetes.io/projected/0bbd0670-7dbf-4928-a161-363090b3dc2f-kube-api-access-9prp8\") pod \"managed-serviceaccount-addon-agent-94d664f4f-b89hv\" (UID: \"0bbd0670-7dbf-4928-a161-363090b3dc2f\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.387882 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.387826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kzpx\" (UniqueName: \"kubernetes.io/projected/f5a66d9e-b3ca-40dd-a57f-474a431be482-kube-api-access-7kzpx\") pod \"cluster-proxy-proxy-agent-786f447f9f-rtft7\" (UID: \"f5a66d9e-b3ca-40dd-a57f-474a431be482\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.453598 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.453557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcd44\" (UniqueName: \"kubernetes.io/projected/9c8959d6-9e7b-4a7c-a001-6e46516e676d-kube-api-access-dcd44\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.453787 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.453616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9c8959d6-9e7b-4a7c-a001-6e46516e676d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.453787 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.453655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9c8959d6-9e7b-4a7c-a001-6e46516e676d-data-volume\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.453787 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.453681 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9c8959d6-9e7b-4a7c-a001-6e46516e676d-crio-socket\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.453787 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.453775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9c8959d6-9e7b-4a7c-a001-6e46516e676d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.465976 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.465937 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"
Apr 16 20:12:27.474892 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.474863 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"
Apr 16 20:12:27.480630 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.480600 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"
Apr 16 20:12:27.555147 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9c8959d6-9e7b-4a7c-a001-6e46516e676d-data-volume\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.555147 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9c8959d6-9e7b-4a7c-a001-6e46516e676d-crio-socket\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg"
Apr 16 20:12:27.555368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName:
\"kubernetes.io/configmap/9c8959d6-9e7b-4a7c-a001-6e46516e676d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.555368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcd44\" (UniqueName: \"kubernetes.io/projected/9c8959d6-9e7b-4a7c-a001-6e46516e676d-kube-api-access-dcd44\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.555368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9c8959d6-9e7b-4a7c-a001-6e46516e676d-crio-socket\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.555368 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9c8959d6-9e7b-4a7c-a001-6e46516e676d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.555994 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.555953 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9c8959d6-9e7b-4a7c-a001-6e46516e676d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.555994 ip-10-0-129-41 
kubenswrapper[2571]: I0416 20:12:27.555958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9c8959d6-9e7b-4a7c-a001-6e46516e676d-data-volume\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.558011 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.557989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9c8959d6-9e7b-4a7c-a001-6e46516e676d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.566076 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.566042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcd44\" (UniqueName: \"kubernetes.io/projected/9c8959d6-9e7b-4a7c-a001-6e46516e676d-kube-api-access-dcd44\") pod \"insights-runtime-extractor-nbhmg\" (UID: \"9c8959d6-9e7b-4a7c-a001-6e46516e676d\") " pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:27.611914 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:27.611879 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nbhmg" Apr 16 20:12:28.614852 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:28.614822 2571 scope.go:117] "RemoveContainer" containerID="0f96bf0e1ab2b67cb8852d1c038e06ce9176ae7bfaf5ef19d7d74502ebdc8f31" Apr 16 20:12:29.053495 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.053110 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:12:29.053672 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.053503 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" event={"ID":"0803ae09-3a9f-4d31-988c-4b5c2a29a6a2","Type":"ContainerStarted","Data":"9b16aa7748491d0f1445becbd90216ff12804923dc340f5788a92ae64c36eb35"} Apr 16 20:12:29.054559 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.054043 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:12:29.078147 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.075256 2571 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-pjp69 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.18:8443/readyz\": dial tcp 10.132.0.18:8443: connect: connection refused" start-of-body= Apr 16 20:12:29.078147 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.075312 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" podUID="0803ae09-3a9f-4d31-988c-4b5c2a29a6a2" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.18:8443/readyz\": dial tcp 10.132.0.18:8443: connect: connection refused" Apr 16 20:12:29.078360 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.078122 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" podStartSLOduration=57.193055758 podStartE2EDuration="1m4.078101867s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:11:55.366183211 +0000 UTC m=+36.273528611" lastFinishedPulling="2026-04-16 20:12:02.25122932 +0000 UTC m=+43.158574720" observedRunningTime="2026-04-16 20:12:29.075174624 +0000 UTC m=+69.982520051" watchObservedRunningTime="2026-04-16 20:12:29.078101867 +0000 UTC m=+69.985447288" Apr 16 20:12:29.181342 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.181312 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv"] Apr 16 20:12:29.252251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.252176 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk"] Apr 16 20:12:29.261234 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:29.261203 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a7ec274_8a20_488b_b3be_ff44a04a1cbd.slice/crio-8202182bc51b6f290999445cb5bce45335d8963364877405da670e2e6e52f9b9 WatchSource:0}: Error finding container 8202182bc51b6f290999445cb5bce45335d8963364877405da670e2e6e52f9b9: Status 404 returned error can't find the container with id 8202182bc51b6f290999445cb5bce45335d8963364877405da670e2e6e52f9b9 Apr 16 20:12:29.410969 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.409763 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nbhmg"] Apr 16 20:12:29.413415 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.413117 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7"] 
Apr 16 20:12:29.425625 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:29.425480 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a66d9e_b3ca_40dd_a57f_474a431be482.slice/crio-04c220eede8bc64bba52b6c8cfe7f4a41eef698be02a79b0a69ff8d22b5b0bde WatchSource:0}: Error finding container 04c220eede8bc64bba52b6c8cfe7f4a41eef698be02a79b0a69ff8d22b5b0bde: Status 404 returned error can't find the container with id 04c220eede8bc64bba52b6c8cfe7f4a41eef698be02a79b0a69ff8d22b5b0bde Apr 16 20:12:29.739365 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.739331 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v"] Apr 16 20:12:29.742427 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.742403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:29.744886 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.744860 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-kzrh6\"" Apr 16 20:12:29.745025 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.744860 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 20:12:29.751233 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.751208 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v"] Apr 16 20:12:29.887568 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.887440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/bd25557f-a77c-411d-9d69-73cec79671f0-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h7d6v\" (UID: \"bd25557f-a77c-411d-9d69-73cec79671f0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:29.989038 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:29.988564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bd25557f-a77c-411d-9d69-73cec79671f0-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h7d6v\" (UID: \"bd25557f-a77c-411d-9d69-73cec79671f0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:29.989038 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:29.988783 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 20:12:29.989038 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:29.988859 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd25557f-a77c-411d-9d69-73cec79671f0-tls-certificates podName:bd25557f-a77c-411d-9d69-73cec79671f0 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:30.488835846 +0000 UTC m=+71.396181251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/bd25557f-a77c-411d-9d69-73cec79671f0-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-h7d6v" (UID: "bd25557f-a77c-411d-9d69-73cec79671f0") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 20:12:30.065053 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.064247 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" event={"ID":"dc063ce5-322d-48b4-9d08-1904d3210ecd","Type":"ContainerStarted","Data":"d1c32eec6b9374b09b3df10f6ffa336784452e3c3bd2342ca0aeaeb62b8903d3"} Apr 16 20:12:30.070593 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.070518 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mchp6" event={"ID":"b3bab96c-fc58-4afa-886a-ff24380a19e6","Type":"ContainerStarted","Data":"69dffac20966ab7a65cfe38b391a9c6a415087e81b4333c71d0b8d3982b05325"} Apr 16 20:12:30.073967 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.073912 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbhmg" event={"ID":"9c8959d6-9e7b-4a7c-a001-6e46516e676d","Type":"ContainerStarted","Data":"cd50b2de0fd86d617474102a4ee79a920126545be07c4ba2f9c5e4c705206a74"} Apr 16 20:12:30.074140 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.073975 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbhmg" event={"ID":"9c8959d6-9e7b-4a7c-a001-6e46516e676d","Type":"ContainerStarted","Data":"3f49c8e302a62cf42b4b6fe3e1868be7a35170751cc19f7ecfbe8e7e1d97ec69"} Apr 16 20:12:30.079631 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.078879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8cq5p" 
event={"ID":"d89c84d3-91b3-4e2f-8457-b44d10b35a08","Type":"ContainerStarted","Data":"2c9856ea75f367343e7caa742093d977723d662feb69280943aff95ec7cdc87e"} Apr 16 20:12:30.079631 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.078919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8cq5p" event={"ID":"d89c84d3-91b3-4e2f-8457-b44d10b35a08","Type":"ContainerStarted","Data":"d460b2489264acd3ed241d10b2cfc7e2da3e18fe99aa0090ff31d5742f43595c"} Apr 16 20:12:30.079631 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.079579 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8cq5p" Apr 16 20:12:30.084468 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.082203 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wndvm" podStartSLOduration=61.014198431 podStartE2EDuration="1m5.082187825s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:12:24.832044618 +0000 UTC m=+65.739390016" lastFinishedPulling="2026-04-16 20:12:28.900033998 +0000 UTC m=+69.807379410" observedRunningTime="2026-04-16 20:12:30.081132201 +0000 UTC m=+70.988477623" watchObservedRunningTime="2026-04-16 20:12:30.082187825 +0000 UTC m=+70.989533267" Apr 16 20:12:30.089696 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.089642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk" event={"ID":"7a7ec274-8a20-488b-b3be-ff44a04a1cbd","Type":"ContainerStarted","Data":"8202182bc51b6f290999445cb5bce45335d8963364877405da670e2e6e52f9b9"} Apr 16 20:12:30.100283 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.100254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5ntx" 
event={"ID":"ee593b7f-fc54-40a3-af7d-f5643196a107","Type":"ContainerStarted","Data":"41d5084691e84ceb4e887d674b5173084c5fe036464ad3da3085b5c10fdc2b73"} Apr 16 20:12:30.100455 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.100442 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5ntx" event={"ID":"ee593b7f-fc54-40a3-af7d-f5643196a107","Type":"ContainerStarted","Data":"602fb1baf9ddfc674520506b48f9147ac02652d65544617bb3906e1aa443b8c0"} Apr 16 20:12:30.119127 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.118481 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" event={"ID":"7f8bccf8-1c1e-4890-98f3-747f76421e6c","Type":"ContainerStarted","Data":"b35c934733d084b9d8c6ac5b0f6c06b3291dc4703ce5441407d2fc811b74106d"} Apr 16 20:12:30.119127 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.118520 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" event={"ID":"7f8bccf8-1c1e-4890-98f3-747f76421e6c","Type":"ContainerStarted","Data":"3472d26ef740ee020664a34657c15c77e425999051502f63fa2ddfa514e5c45d"} Apr 16 20:12:30.122763 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.122722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7" event={"ID":"f5a66d9e-b3ca-40dd-a57f-474a431be482","Type":"ContainerStarted","Data":"04c220eede8bc64bba52b6c8cfe7f4a41eef698be02a79b0a69ff8d22b5b0bde"} Apr 16 20:12:30.127662 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.125984 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mchp6" podStartSLOduration=34.268364988 podStartE2EDuration="38.125966967s" podCreationTimestamp="2026-04-16 20:11:52 +0000 UTC" firstStartedPulling="2026-04-16 20:12:25.056111967 +0000 UTC 
m=+65.963457365" lastFinishedPulling="2026-04-16 20:12:28.913713942 +0000 UTC m=+69.821059344" observedRunningTime="2026-04-16 20:12:30.098424394 +0000 UTC m=+71.005769815" watchObservedRunningTime="2026-04-16 20:12:30.125966967 +0000 UTC m=+71.033312388" Apr 16 20:12:30.128448 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.127948 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8cq5p" podStartSLOduration=34.309310729 podStartE2EDuration="38.127934474s" podCreationTimestamp="2026-04-16 20:11:52 +0000 UTC" firstStartedPulling="2026-04-16 20:12:25.07928951 +0000 UTC m=+65.986634907" lastFinishedPulling="2026-04-16 20:12:28.897913239 +0000 UTC m=+69.805258652" observedRunningTime="2026-04-16 20:12:30.123910757 +0000 UTC m=+71.031256183" watchObservedRunningTime="2026-04-16 20:12:30.127934474 +0000 UTC m=+71.035279899" Apr 16 20:12:30.130558 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.130011 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" event={"ID":"05a809af-d1a2-4af3-9ac4-46b14e4aada1","Type":"ContainerStarted","Data":"0bd955b15acdb6da35c9e0699037ba394582713685bc66c3a0025213424f687e"} Apr 16 20:12:30.135017 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.134994 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv" event={"ID":"0bbd0670-7dbf-4928-a161-363090b3dc2f","Type":"ContainerStarted","Data":"21ab8f61e6882798efa1354cb5282ef9793a82ca8617c7d7f752ce45848d1c92"} Apr 16 20:12:30.180987 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.172669 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-d2sbf" podStartSLOduration=61.358986273 podStartE2EDuration="1m5.172648601s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" 
firstStartedPulling="2026-04-16 20:12:25.09245996 +0000 UTC m=+65.999805358" lastFinishedPulling="2026-04-16 20:12:28.906122274 +0000 UTC m=+69.813467686" observedRunningTime="2026-04-16 20:12:30.171171799 +0000 UTC m=+71.078517220" watchObservedRunningTime="2026-04-16 20:12:30.172648601 +0000 UTC m=+71.079994023" Apr 16 20:12:30.180987 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.173433 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h5ntx" podStartSLOduration=66.979483604 podStartE2EDuration="1m11.173426327s" podCreationTimestamp="2026-04-16 20:11:19 +0000 UTC" firstStartedPulling="2026-04-16 20:12:24.686885069 +0000 UTC m=+65.594230469" lastFinishedPulling="2026-04-16 20:12:28.880827779 +0000 UTC m=+69.788173192" observedRunningTime="2026-04-16 20:12:30.150143222 +0000 UTC m=+71.057488643" watchObservedRunningTime="2026-04-16 20:12:30.173426327 +0000 UTC m=+71.080771751" Apr 16 20:12:30.271393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.271313 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-pjp69" Apr 16 20:12:30.299377 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.299293 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-h59sf" podStartSLOduration=61.312646138 podStartE2EDuration="1m5.299274081s" podCreationTimestamp="2026-04-16 20:11:25 +0000 UTC" firstStartedPulling="2026-04-16 20:12:24.912028186 +0000 UTC m=+65.819373605" lastFinishedPulling="2026-04-16 20:12:28.898656138 +0000 UTC m=+69.806001548" observedRunningTime="2026-04-16 20:12:30.201759785 +0000 UTC m=+71.109105207" watchObservedRunningTime="2026-04-16 20:12:30.299274081 +0000 UTC m=+71.206619508" Apr 16 20:12:30.496917 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.496263 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bd25557f-a77c-411d-9d69-73cec79671f0-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h7d6v\" (UID: \"bd25557f-a77c-411d-9d69-73cec79671f0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:30.511698 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.511630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/bd25557f-a77c-411d-9d69-73cec79671f0-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h7d6v\" (UID: \"bd25557f-a77c-411d-9d69-73cec79671f0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:30.515049 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.513655 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rcwh4"] Apr 16 20:12:30.519393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.518413 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:30.522170 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.521516 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:12:30.522170 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.521743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:12:30.522170 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.521974 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-rj48x\"" Apr 16 20:12:30.530165 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.530133 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rcwh4"] Apr 16 20:12:30.658742 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.658038 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:30.698794 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.698581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5hw\" (UniqueName: \"kubernetes.io/projected/39df3605-304c-4ee5-ad56-f96333f9a031-kube-api-access-lj5hw\") pod \"downloads-6bcc868b7-rcwh4\" (UID: \"39df3605-304c-4ee5-ad56-f96333f9a031\") " pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:30.799773 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.799304 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5hw\" (UniqueName: \"kubernetes.io/projected/39df3605-304c-4ee5-ad56-f96333f9a031-kube-api-access-lj5hw\") pod \"downloads-6bcc868b7-rcwh4\" (UID: \"39df3605-304c-4ee5-ad56-f96333f9a031\") " pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:30.818476 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.813304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5hw\" (UniqueName: \"kubernetes.io/projected/39df3605-304c-4ee5-ad56-f96333f9a031-kube-api-access-lj5hw\") pod \"downloads-6bcc868b7-rcwh4\" (UID: \"39df3605-304c-4ee5-ad56-f96333f9a031\") " pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:30.838852 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.838813 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:30.918295 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:30.918257 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v"] Apr 16 20:12:31.036977 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:31.036694 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rcwh4"] Apr 16 20:12:31.144798 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:31.144724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rcwh4" event={"ID":"39df3605-304c-4ee5-ad56-f96333f9a031","Type":"ContainerStarted","Data":"eaceb4f94b7c4aa1bcc1537c07bf7a3a61aa37de3491295d9ba6d1bc32101a92"} Apr 16 20:12:31.150892 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:31.150814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" event={"ID":"bd25557f-a77c-411d-9d69-73cec79671f0","Type":"ContainerStarted","Data":"7a82df08b5197009e51012467bb7d3391105d2308f7a68cf0aeeef768cae7b4a"} Apr 16 20:12:31.157566 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:31.156216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbhmg" event={"ID":"9c8959d6-9e7b-4a7c-a001-6e46516e676d","Type":"ContainerStarted","Data":"3aa3ba6be4aeac5713a8fe4929f65d57082fcc19ea294b25e481e02ccaeb40c2"} Apr 16 20:12:39.197061 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.197018 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7" event={"ID":"f5a66d9e-b3ca-40dd-a57f-474a431be482","Type":"ContainerStarted","Data":"40504996f2978ab3a44727ef1df67f3c756fd7ddbcf3df66386ab1dccab4bb99"} Apr 16 20:12:39.198638 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.198598 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv" event={"ID":"0bbd0670-7dbf-4928-a161-363090b3dc2f","Type":"ContainerStarted","Data":"48a98c9eaaa834b3d086aef21e6ba2554d49519b3af0bf8c3db08a70928fb3af"} Apr 16 20:12:39.201074 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.201045 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" event={"ID":"bd25557f-a77c-411d-9d69-73cec79671f0","Type":"ContainerStarted","Data":"39236edb50d94a700807b067bb74a86a4a6b1cfb73a6b6fa4f0150bb0d45b849"} Apr 16 20:12:39.201252 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.201234 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:39.204429 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.204403 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nbhmg" event={"ID":"9c8959d6-9e7b-4a7c-a001-6e46516e676d","Type":"ContainerStarted","Data":"1004e5694623a6ce48ad9e75f7e5528274061759e30600e7950114c2f337d424"} Apr 16 20:12:39.206047 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.206016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk" event={"ID":"7a7ec274-8a20-488b-b3be-ff44a04a1cbd","Type":"ContainerStarted","Data":"0fcad14eff2d978adea4eaaa3cf8e814f373324c53c5401ebf2551d20fc22617"} Apr 16 20:12:39.206482 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.206456 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk" Apr 16 20:12:39.208601 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.208579 2571 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" Apr 16 20:12:39.208921 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.208898 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk" Apr 16 20:12:39.222097 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.221838 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-94d664f4f-b89hv" podStartSLOduration=3.120215577 podStartE2EDuration="12.221824421s" podCreationTimestamp="2026-04-16 20:12:27 +0000 UTC" firstStartedPulling="2026-04-16 20:12:29.208869864 +0000 UTC m=+70.116215267" lastFinishedPulling="2026-04-16 20:12:38.31047871 +0000 UTC m=+79.217824111" observedRunningTime="2026-04-16 20:12:39.219800993 +0000 UTC m=+80.127146414" watchObservedRunningTime="2026-04-16 20:12:39.221824421 +0000 UTC m=+80.129169840" Apr 16 20:12:39.239791 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.239735 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nbhmg" podStartSLOduration=3.466323486 podStartE2EDuration="12.239718126s" podCreationTimestamp="2026-04-16 20:12:27 +0000 UTC" firstStartedPulling="2026-04-16 20:12:29.537884748 +0000 UTC m=+70.445230148" lastFinishedPulling="2026-04-16 20:12:38.311279384 +0000 UTC m=+79.218624788" observedRunningTime="2026-04-16 20:12:39.237221873 +0000 UTC m=+80.144567292" watchObservedRunningTime="2026-04-16 20:12:39.239718126 +0000 UTC m=+80.147063547" Apr 16 20:12:39.255251 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.255183 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h7d6v" podStartSLOduration=2.8715605330000002 podStartE2EDuration="10.255164368s" 
podCreationTimestamp="2026-04-16 20:12:29 +0000 UTC" firstStartedPulling="2026-04-16 20:12:30.927743512 +0000 UTC m=+71.835088912" lastFinishedPulling="2026-04-16 20:12:38.311347345 +0000 UTC m=+79.218692747" observedRunningTime="2026-04-16 20:12:39.253690211 +0000 UTC m=+80.161035632" watchObservedRunningTime="2026-04-16 20:12:39.255164368 +0000 UTC m=+80.162509788" Apr 16 20:12:39.273639 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:39.273565 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-695457864-sstkk" podStartSLOduration=3.208905434 podStartE2EDuration="12.273546135s" podCreationTimestamp="2026-04-16 20:12:27 +0000 UTC" firstStartedPulling="2026-04-16 20:12:29.265238148 +0000 UTC m=+70.172583550" lastFinishedPulling="2026-04-16 20:12:38.329878849 +0000 UTC m=+79.237224251" observedRunningTime="2026-04-16 20:12:39.270877358 +0000 UTC m=+80.178222781" watchObservedRunningTime="2026-04-16 20:12:39.273546135 +0000 UTC m=+80.180891553" Apr 16 20:12:41.163075 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:41.163043 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8cq5p" Apr 16 20:12:41.241281 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:41.241212 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7" event={"ID":"f5a66d9e-b3ca-40dd-a57f-474a431be482","Type":"ContainerStarted","Data":"99f53c20ad4c42136415796bed5813ce0c512cb8ee26dc8cfd44cc9786276eee"} Apr 16 20:12:41.241281 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:41.241261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7" event={"ID":"f5a66d9e-b3ca-40dd-a57f-474a431be482","Type":"ContainerStarted","Data":"dc4097d00572ecde430ccd74c104fc378125e050b884607fb134ac45a9ca3336"} Apr 16 
20:12:41.261647 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:41.261561 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-786f447f9f-rtft7" podStartSLOduration=3.068743862 podStartE2EDuration="14.26152248s" podCreationTimestamp="2026-04-16 20:12:27 +0000 UTC" firstStartedPulling="2026-04-16 20:12:29.43367649 +0000 UTC m=+70.341021889" lastFinishedPulling="2026-04-16 20:12:40.626455105 +0000 UTC m=+81.533800507" observedRunningTime="2026-04-16 20:12:41.260819601 +0000 UTC m=+82.168165022" watchObservedRunningTime="2026-04-16 20:12:41.26152248 +0000 UTC m=+82.168867900" Apr 16 20:12:44.690856 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:44.690817 2571 patch_prober.go:28] interesting pod/image-registry-5847757746-fm52t container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 20:12:44.691306 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:44.690893 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5847757746-fm52t" podUID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:12:45.208816 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.208777 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fbpfb"] Apr 16 20:12:45.214856 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.214824 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.217569 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.217297 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:12:45.217569 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.217399 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:12:45.217569 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.217307 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6m8jn\"" Apr 16 20:12:45.220151 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.218504 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:12:45.220151 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.218749 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:12:45.315147 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqkd\" (UniqueName: \"kubernetes.io/projected/4b06bd44-2144-4a0c-9348-290d7c1dffdb-kube-api-access-jmqkd\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-wtmp\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " 
pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-textfile\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-tls\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-accelerators-collector-config\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315307 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-sys\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315346 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b06bd44-2144-4a0c-9348-290d7c1dffdb-metrics-client-ca\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.315720 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.315361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-root\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415762 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-tls\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415958 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415783 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-accelerators-collector-config\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415958 
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415958 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415859 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-sys\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415958 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:45.415872 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 20:12:45.415958 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b06bd44-2144-4a0c-9348-290d7c1dffdb-metrics-client-ca\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415958 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-root\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.415958 ip-10-0-129-41 kubenswrapper[2571]: E0416 20:12:45.415945 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-tls podName:4b06bd44-2144-4a0c-9348-290d7c1dffdb nodeName:}" failed. No retries permitted until 2026-04-16 20:12:45.915921884 +0000 UTC m=+86.823267288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-tls") pod "node-exporter-fbpfb" (UID: "4b06bd44-2144-4a0c-9348-290d7c1dffdb") : secret "node-exporter-tls" not found Apr 16 20:12:45.416329 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.415969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-root\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416329 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqkd\" (UniqueName: \"kubernetes.io/projected/4b06bd44-2144-4a0c-9348-290d7c1dffdb-kube-api-access-jmqkd\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416329 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-wtmp\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416329 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-textfile\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416329 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416192 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-sys\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416599 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-accelerators-collector-config\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416653 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b06bd44-2144-4a0c-9348-290d7c1dffdb-metrics-client-ca\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.416888 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416824 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-wtmp\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.417017 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.416899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-textfile\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.418919 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.418890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.428945 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.428923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqkd\" (UniqueName: \"kubernetes.io/projected/4b06bd44-2144-4a0c-9348-290d7c1dffdb-kube-api-access-jmqkd\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.920717 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.920625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-tls\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:45.923506 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:45.923472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b06bd44-2144-4a0c-9348-290d7c1dffdb-node-exporter-tls\") pod \"node-exporter-fbpfb\" (UID: \"4b06bd44-2144-4a0c-9348-290d7c1dffdb\") " pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:46.130683 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:46.130650 2571 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fbpfb" Apr 16 20:12:48.048727 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:48.048687 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:12:50.316125 ip-10-0-129-41 kubenswrapper[2571]: W0416 20:12:50.316093 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b06bd44_2144_4a0c_9348_290d7c1dffdb.slice/crio-62c0018a73d57404d8c3feab3f9627f868daf55cdec680756b8bb31f5e2a948c WatchSource:0}: Error finding container 62c0018a73d57404d8c3feab3f9627f868daf55cdec680756b8bb31f5e2a948c: Status 404 returned error can't find the container with id 62c0018a73d57404d8c3feab3f9627f868daf55cdec680756b8bb31f5e2a948c Apr 16 20:12:51.277483 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:51.277443 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rcwh4" event={"ID":"39df3605-304c-4ee5-ad56-f96333f9a031","Type":"ContainerStarted","Data":"98f6dcce01865accfcbaf4bf8867c61eb23380002562bf7089fa757ae72e2a78"} Apr 16 20:12:51.277936 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:51.277919 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:51.279343 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:51.279313 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fbpfb" event={"ID":"4b06bd44-2144-4a0c-9348-290d7c1dffdb","Type":"ContainerStarted","Data":"586c1c222d82b186c2e2aa1ad72dff93d10d97da24031faad63e2a8a31f6b1b7"} Apr 16 20:12:51.279454 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:51.279354 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fbpfb" 
event={"ID":"4b06bd44-2144-4a0c-9348-290d7c1dffdb","Type":"ContainerStarted","Data":"62c0018a73d57404d8c3feab3f9627f868daf55cdec680756b8bb31f5e2a948c"} Apr 16 20:12:51.288559 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:51.288512 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rcwh4" Apr 16 20:12:51.297415 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:51.297365 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rcwh4" podStartSLOduration=1.642675966 podStartE2EDuration="21.297349522s" podCreationTimestamp="2026-04-16 20:12:30 +0000 UTC" firstStartedPulling="2026-04-16 20:12:31.048004089 +0000 UTC m=+71.955349489" lastFinishedPulling="2026-04-16 20:12:50.702677643 +0000 UTC m=+91.610023045" observedRunningTime="2026-04-16 20:12:51.295573671 +0000 UTC m=+92.202919115" watchObservedRunningTime="2026-04-16 20:12:51.297349522 +0000 UTC m=+92.204694988" Apr 16 20:12:52.284382 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:52.284340 2571 generic.go:358] "Generic (PLEG): container finished" podID="4b06bd44-2144-4a0c-9348-290d7c1dffdb" containerID="586c1c222d82b186c2e2aa1ad72dff93d10d97da24031faad63e2a8a31f6b1b7" exitCode=0 Apr 16 20:12:52.284876 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:52.284425 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fbpfb" event={"ID":"4b06bd44-2144-4a0c-9348-290d7c1dffdb","Type":"ContainerDied","Data":"586c1c222d82b186c2e2aa1ad72dff93d10d97da24031faad63e2a8a31f6b1b7"} Apr 16 20:12:53.290625 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:53.290573 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fbpfb" event={"ID":"4b06bd44-2144-4a0c-9348-290d7c1dffdb","Type":"ContainerStarted","Data":"4ff4d12e04a11d8f5000d39e1e3cb0f076fdcf655020ca746cd2a1d7ad133be2"} Apr 16 20:12:53.291088 ip-10-0-129-41 kubenswrapper[2571]: 
I0416 20:12:53.290635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fbpfb" event={"ID":"4b06bd44-2144-4a0c-9348-290d7c1dffdb","Type":"ContainerStarted","Data":"5848518debbead4dc508b113c39443b7e6f2da72efc0eb64495727657d37b1fd"} Apr 16 20:12:53.312019 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:53.311956 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fbpfb" podStartSLOduration=7.532071402 podStartE2EDuration="8.311938808s" podCreationTimestamp="2026-04-16 20:12:45 +0000 UTC" firstStartedPulling="2026-04-16 20:12:50.31786679 +0000 UTC m=+91.225212202" lastFinishedPulling="2026-04-16 20:12:51.097734195 +0000 UTC m=+92.005079608" observedRunningTime="2026-04-16 20:12:53.310965316 +0000 UTC m=+94.218310747" watchObservedRunningTime="2026-04-16 20:12:53.311938808 +0000 UTC m=+94.219284227" Apr 16 20:12:54.193001 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:54.192960 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5847757746-fm52t"] Apr 16 20:12:56.039700 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:12:56.039667 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zdf2s" Apr 16 20:13:13.353239 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:13.353199 2571 generic.go:358] "Generic (PLEG): container finished" podID="2d883e9c-34a0-4bc6-8784-879380b900d3" containerID="34f2d7dfe4272dab333dcc440b007e538772468c17a0f62e2e9662ff96f83527" exitCode=0 Apr 16 20:13:13.353673 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:13.353271 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" event={"ID":"2d883e9c-34a0-4bc6-8784-879380b900d3","Type":"ContainerDied","Data":"34f2d7dfe4272dab333dcc440b007e538772468c17a0f62e2e9662ff96f83527"} Apr 16 20:13:13.353673 ip-10-0-129-41 
kubenswrapper[2571]: I0416 20:13:13.353627 2571 scope.go:117] "RemoveContainer" containerID="34f2d7dfe4272dab333dcc440b007e538772468c17a0f62e2e9662ff96f83527" Apr 16 20:13:14.358237 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:14.358191 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-8f7hz" event={"ID":"2d883e9c-34a0-4bc6-8784-879380b900d3","Type":"ContainerStarted","Data":"60c1540c1654c2e95cdbebdafb7f0de9253a3dfb8ad339acea89fb408aafc2f5"} Apr 16 20:13:14.359623 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:14.359596 2571 generic.go:358] "Generic (PLEG): container finished" podID="9293d674-a736-4062-a5ad-cc844313fbfe" containerID="075983a23efeed2cf1e6e6db8ad68d220ed4997c8c6130dd8ec96b979100eb21" exitCode=0 Apr 16 20:13:14.359738 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:14.359635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" event={"ID":"9293d674-a736-4062-a5ad-cc844313fbfe","Type":"ContainerDied","Data":"075983a23efeed2cf1e6e6db8ad68d220ed4997c8c6130dd8ec96b979100eb21"} Apr 16 20:13:14.359866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:14.359855 2571 scope.go:117] "RemoveContainer" containerID="075983a23efeed2cf1e6e6db8ad68d220ed4997c8c6130dd8ec96b979100eb21" Apr 16 20:13:15.369548 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:15.369497 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-4g5ns" event={"ID":"9293d674-a736-4062-a5ad-cc844313fbfe","Type":"ContainerStarted","Data":"c949aa3dc3dbacedeb5f1f89283edc9bf82b9b347599dabed1b967fa5242f1cc"} Apr 16 20:13:19.216947 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.216903 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-5847757746-fm52t" podUID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" containerName="registry" containerID="cri-o://0d2885ec3c4c868f6fe277cde1a89c76bf360db5f0850e4a526d5d62fa113fe9" gracePeriod=30 Apr 16 20:13:19.382291 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.382246 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" containerID="0d2885ec3c4c868f6fe277cde1a89c76bf360db5f0850e4a526d5d62fa113fe9" exitCode=0 Apr 16 20:13:19.382445 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.382318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5847757746-fm52t" event={"ID":"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1","Type":"ContainerDied","Data":"0d2885ec3c4c868f6fe277cde1a89c76bf360db5f0850e4a526d5d62fa113fe9"} Apr 16 20:13:19.466945 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.466922 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:13:19.621772 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621734 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-installation-pull-secrets\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.621964 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621800 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-ca-trust-extracted\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.621964 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621860 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-trusted-ca\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.621964 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621895 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.621964 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621925 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-certificates\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.621964 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621957 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rscxz\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-kube-api-access-rscxz\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.622227 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.621981 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-image-registry-private-configuration\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.622227 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.622159 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-bound-sa-token\") pod \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\" (UID: \"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1\") " Apr 16 20:13:19.623300 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.623021 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:19.623300 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.623238 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:13:19.625068 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.625039 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:19.625507 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.625461 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:13:19.625507 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.625469 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:13:19.625682 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.625651 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:13:19.626228 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.626160 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-kube-api-access-rscxz" (OuterVolumeSpecName: "kube-api-access-rscxz") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "kube-api-access-rscxz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:13:19.633482 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.633454 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" (UID: "bc1a4b01-eb2b-4766-9d17-3b931cb5aab1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:13:19.723295 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723259 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-trusted-ca\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723295 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723289 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-tls\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723295 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723299 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-registry-certificates\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723522 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723309 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rscxz\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-kube-api-access-rscxz\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723522 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723320 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-image-registry-private-configuration\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723522 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723330 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-bound-sa-token\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723522 
ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723342 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-installation-pull-secrets\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:19.723522 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:19.723353 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1-ca-trust-extracted\") on node \"ip-10-0-129-41.ec2.internal\" DevicePath \"\"" Apr 16 20:13:20.386159 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:20.386126 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5847757746-fm52t" Apr 16 20:13:20.386159 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:20.386138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5847757746-fm52t" event={"ID":"bc1a4b01-eb2b-4766-9d17-3b931cb5aab1","Type":"ContainerDied","Data":"532f909bb8d3d29374e6bbfd39980ec0459c1a97c709b892af2a86812e683329"} Apr 16 20:13:20.386666 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:20.386195 2571 scope.go:117] "RemoveContainer" containerID="0d2885ec3c4c868f6fe277cde1a89c76bf360db5f0850e4a526d5d62fa113fe9" Apr 16 20:13:20.408158 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:20.408129 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5847757746-fm52t"] Apr 16 20:13:20.412425 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:20.412400 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5847757746-fm52t"] Apr 16 20:13:21.619315 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:21.619282 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" 
path="/var/lib/kubelet/pods/bc1a4b01-eb2b-4766-9d17-3b931cb5aab1/volumes" Apr 16 20:13:34.428884 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:34.428847 2571 generic.go:358] "Generic (PLEG): container finished" podID="89638695-826b-4c66-a544-96f10200a105" containerID="f0a0a5b71f77a2226e5db348bb5dd029f75af991ae976b9a9c7d32f3cffd8062" exitCode=0 Apr 16 20:13:34.429342 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:34.428921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" event={"ID":"89638695-826b-4c66-a544-96f10200a105","Type":"ContainerDied","Data":"f0a0a5b71f77a2226e5db348bb5dd029f75af991ae976b9a9c7d32f3cffd8062"} Apr 16 20:13:34.429342 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:34.429295 2571 scope.go:117] "RemoveContainer" containerID="f0a0a5b71f77a2226e5db348bb5dd029f75af991ae976b9a9c7d32f3cffd8062" Apr 16 20:13:35.433866 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:13:35.433831 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qlvhv" event={"ID":"89638695-826b-4c66-a544-96f10200a105","Type":"ContainerStarted","Data":"ee890ee96bf7bd26d0033bee8c4c73f0964df1242212a65303e4ef1d82e6cef1"} Apr 16 20:16:19.556605 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:16:19.556573 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:16:19.562053 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:16:19.562022 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:16:19.565625 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:16:19.565598 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:16:19.567548 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:16:19.567507 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:16:19.569744 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:16:19.569723 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:21:19.583123 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:21:19.583087 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:21:19.583596 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:21:19.583273 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:21:19.588082 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:21:19.588050 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:21:19.588651 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:21:19.588627 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:26:19.603895 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:26:19.603800 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:26:19.605932 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:26:19.605905 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:26:19.608947 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:26:19.608926 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:26:19.611051 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:26:19.611034 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:31:19.626150 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:31:19.626122 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:31:19.635393 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:31:19.635365 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:31:19.636355 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:31:19.636331 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:31:19.640444 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:31:19.640425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:36:19.655545 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:36:19.655500 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:36:19.660397 ip-10-0-129-41 kubenswrapper[2571]: I0416 
20:36:19.660377 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:36:19.660578 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:36:19.660516 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:36:19.664852 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:36:19.664833 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:41:19.677059 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:41:19.676931 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:41:19.682282 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:41:19.682257 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:41:19.682545 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:41:19.682514 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:41:19.686975 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:41:19.686958 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:46:19.702931 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:46:19.702806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 
16 20:46:19.707435 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:46:19.707408 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:46:19.707592 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:46:19.707519 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:46:19.712210 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:46:19.712193 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:51:19.727677 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:51:19.727552 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:51:19.732409 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:51:19.732385 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:51:19.733817 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:51:19.733792 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:51:19.738459 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:51:19.738436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:56:19.749909 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:56:19.749795 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:56:19.755100 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:56:19.755079 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:56:19.758895 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:56:19.758866 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 20:56:19.769061 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:56:19.769034 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log" Apr 16 20:59:57.478213 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:59:57.478160 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xqx4j_28b55b45-a77c-407d-b55c-8a2538906ceb/global-pull-secret-syncer/0.log" Apr 16 20:59:57.600493 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:59:57.600458 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s4k6r_8d864121-3e7d-4667-a357-2bc3c0ff03ca/konnectivity-agent/0.log" Apr 16 20:59:57.621110 ip-10-0-129-41 kubenswrapper[2571]: I0416 20:59:57.621081 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-41.ec2.internal_d35c82c9bfffde60f444dc698a8db807/haproxy/0.log" Apr 16 21:00:01.492780 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:01.492748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-wndvm_dc063ce5-322d-48b4-9d08-1904d3210ecd/cluster-monitoring-operator/0.log" Apr 16 21:00:01.651893 ip-10-0-129-41 
kubenswrapper[2571]: I0416 21:00:01.651862 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fbpfb_4b06bd44-2144-4a0c-9348-290d7c1dffdb/node-exporter/0.log" Apr 16 21:00:01.678384 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:01.678359 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fbpfb_4b06bd44-2144-4a0c-9348-290d7c1dffdb/kube-rbac-proxy/0.log" Apr 16 21:00:01.700147 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:01.700122 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fbpfb_4b06bd44-2144-4a0c-9348-290d7c1dffdb/init-textfile/0.log" Apr 16 21:00:02.231998 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:02.231965 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-h7d6v_bd25557f-a77c-411d-9d69-73cec79671f0/prometheus-operator-admission-webhook/0.log" Apr 16 21:00:03.457842 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:03.457802 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-h59sf_05a809af-d1a2-4af3-9ac4-46b14e4aada1/networking-console-plugin/0.log" Apr 16 21:00:03.851117 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:03.851086 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/1.log" Apr 16 21:00:03.855113 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:03.855090 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pjp69_0803ae09-3a9f-4d31-988c-4b5c2a29a6a2/console-operator/2.log" Apr 16 21:00:04.199120 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:04.199038 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-6bcc868b7-rcwh4_39df3605-304c-4ee5-ad56-f96333f9a031/download-server/0.log" Apr 16 21:00:04.547984 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:04.547909 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-p4xv5_799e94dc-a712-492e-a369-129299525b15/volume-data-source-validator/0.log" Apr 16 21:00:05.111696 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.111667 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8cq5p_d89c84d3-91b3-4e2f-8457-b44d10b35a08/dns/0.log" Apr 16 21:00:05.132314 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.132279 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8cq5p_d89c84d3-91b3-4e2f-8457-b44d10b35a08/kube-rbac-proxy/0.log" Apr 16 21:00:05.252670 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.252631 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-24qzw_916eb4ac-9d70-46ff-98fa-3fc5ae7cc85e/dns-node-resolver/0.log" Apr 16 21:00:05.724740 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.724703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn"] Apr 16 21:00:05.725136 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.725015 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" containerName="registry" Apr 16 21:00:05.725136 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.725027 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" containerName="registry" Apr 16 21:00:05.725136 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.725088 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc1a4b01-eb2b-4766-9d17-3b931cb5aab1" containerName="registry" Apr 16 21:00:05.728215 
ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.728193 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.730561 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.730515 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wcf77\"/\"default-dockercfg-rx7m6\"" Apr 16 21:00:05.731471 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.731454 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wcf77\"/\"kube-root-ca.crt\"" Apr 16 21:00:05.731587 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.731480 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wcf77\"/\"openshift-service-ca.crt\"" Apr 16 21:00:05.738370 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.738345 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn"] Apr 16 21:00:05.761767 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.761733 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-podres\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.761934 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.761777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cqr\" (UniqueName: \"kubernetes.io/projected/dd9022ec-af28-4ea4-899c-f31e47a33646-kube-api-access-q8cqr\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.761934 
ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.761828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-lib-modules\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.761934 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.761875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-sys\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.761934 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.761921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-proc\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.771153 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.771126 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xz7cv_7aa3c845-972e-41e1-89d2-9126f2eb4905/node-ca/0.log" Apr 16 21:00:05.862277 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-lib-modules\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862447 ip-10-0-129-41 
kubenswrapper[2571]: I0416 21:00:05.862291 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-sys\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862447 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-proc\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862447 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862354 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-podres\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862447 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cqr\" (UniqueName: \"kubernetes.io/projected/dd9022ec-af28-4ea4-899c-f31e47a33646-kube-api-access-q8cqr\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862447 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-lib-modules\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") 
" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862619 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-sys\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862619 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-podres\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.862619 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.862543 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/dd9022ec-af28-4ea4-899c-f31e47a33646-proc\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:05.870332 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:05.870310 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cqr\" (UniqueName: \"kubernetes.io/projected/dd9022ec-af28-4ea4-899c-f31e47a33646-kube-api-access-q8cqr\") pod \"perf-node-gather-daemonset-9lckn\" (UID: \"dd9022ec-af28-4ea4-899c-f31e47a33646\") " pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" Apr 16 21:00:06.038324 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.038227 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn"
Apr 16 21:00:06.164231 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.164119 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn"]
Apr 16 21:00:06.167131 ip-10-0-129-41 kubenswrapper[2571]: W0416 21:00:06.167104 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddd9022ec_af28_4ea4_899c_f31e47a33646.slice/crio-aa6db983424b21889446a4cda59bad69aa30f55be7e6a989c194bfcd587556a0 WatchSource:0}: Error finding container aa6db983424b21889446a4cda59bad69aa30f55be7e6a989c194bfcd587556a0: Status 404 returned error can't find the container with id aa6db983424b21889446a4cda59bad69aa30f55be7e6a989c194bfcd587556a0
Apr 16 21:00:06.168802 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.168785 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:00:06.479408 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.479368 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-984b845fd-2gf9m_6f259876-3b65-4751-ba98-118d8aa205f9/router/0.log"
Apr 16 21:00:06.624039 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.624006 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" event={"ID":"dd9022ec-af28-4ea4-899c-f31e47a33646","Type":"ContainerStarted","Data":"8812a18915f0810ce832436027af4cc50eb73829ad6ec68dcd06ff540a100313"}
Apr 16 21:00:06.624039 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.624043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" event={"ID":"dd9022ec-af28-4ea4-899c-f31e47a33646","Type":"ContainerStarted","Data":"aa6db983424b21889446a4cda59bad69aa30f55be7e6a989c194bfcd587556a0"}
Apr 16 21:00:06.624306 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.624081 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn"
Apr 16 21:00:06.641218 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.641163 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn" podStartSLOduration=1.641149028 podStartE2EDuration="1.641149028s" podCreationTimestamp="2026-04-16 21:00:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:00:06.639824006 +0000 UTC m=+2927.547169450" watchObservedRunningTime="2026-04-16 21:00:06.641149028 +0000 UTC m=+2927.548494449"
Apr 16 21:00:06.777598 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:06.777567 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mchp6_b3bab96c-fc58-4afa-886a-ff24380a19e6/serve-healthcheck-canary/0.log"
Apr 16 21:00:07.157489 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:07.157451 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qlvhv_89638695-826b-4c66-a544-96f10200a105/insights-operator/0.log"
Apr 16 21:00:07.157876 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:07.157859 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qlvhv_89638695-826b-4c66-a544-96f10200a105/insights-operator/1.log"
Apr 16 21:00:07.242334 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:07.242301 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nbhmg_9c8959d6-9e7b-4a7c-a001-6e46516e676d/kube-rbac-proxy/0.log"
Apr 16 21:00:07.264542 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:07.264501 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nbhmg_9c8959d6-9e7b-4a7c-a001-6e46516e676d/exporter/0.log"
Apr 16 21:00:07.286279 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:07.286248 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nbhmg_9c8959d6-9e7b-4a7c-a001-6e46516e676d/extractor/0.log"
Apr 16 21:00:12.637291 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:12.637262 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wcf77/perf-node-gather-daemonset-9lckn"
Apr 16 21:00:13.253342 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:13.253316 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5dbt4_93f97942-290d-4f48-a08a-5866a654cc19/migrator/0.log"
Apr 16 21:00:13.276450 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:13.276425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-5dbt4_93f97942-290d-4f48-a08a-5866a654cc19/graceful-termination/0.log"
Apr 16 21:00:13.558568 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:13.558469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4g5ns_9293d674-a736-4062-a5ad-cc844313fbfe/kube-storage-version-migrator-operator/1.log"
Apr 16 21:00:13.559469 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:13.559442 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-4g5ns_9293d674-a736-4062-a5ad-cc844313fbfe/kube-storage-version-migrator-operator/0.log"
Apr 16 21:00:14.474236 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:14.474150 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gf7l_844406c9-7055-481f-ae73-5d4d7500e71d/kube-multus/0.log"
Apr 16 21:00:14.856683 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:14.856656 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:00:14.890087 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:14.890056 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/egress-router-binary-copy/0.log"
Apr 16 21:00:14.937233 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:14.937197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/cni-plugins/0.log"
Apr 16 21:00:14.974631 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:14.974602 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/bond-cni-plugin/0.log"
Apr 16 21:00:15.014590 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:15.014511 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/routeoverride-cni/0.log"
Apr 16 21:00:15.054420 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:15.054383 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/whereabouts-cni-bincopy/0.log"
Apr 16 21:00:15.095463 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:15.095436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fbpzt_9c611987-0423-4488-b0f7-408d1c68cda1/whereabouts-cni/0.log"
Apr 16 21:00:15.686387 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:15.686352 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h5ntx_ee593b7f-fc54-40a3-af7d-f5643196a107/network-metrics-daemon/0.log"
Apr 16 21:00:15.715597 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:15.715485 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h5ntx_ee593b7f-fc54-40a3-af7d-f5643196a107/kube-rbac-proxy/0.log"
Apr 16 21:00:16.385520 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.385493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-controller/0.log"
Apr 16 21:00:16.405290 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.405262 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/0.log"
Apr 16 21:00:16.417471 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.417443 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovn-acl-logging/1.log"
Apr 16 21:00:16.437400 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.437374 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/kube-rbac-proxy-node/0.log"
Apr 16 21:00:16.463602 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.463576 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 21:00:16.487251 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.487225 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/northd/0.log"
Apr 16 21:00:16.510972 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.510944 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/nbdb/0.log"
Apr 16 21:00:16.537932 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.537909 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/sbdb/0.log"
Apr 16 21:00:16.637903 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:16.637816 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-blr8v_1f2eb888-db83-4f12-83ec-2f634c4cf807/ovnkube-controller/0.log"
Apr 16 21:00:18.229037 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:18.229007 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-gvgs9_025cbe57-f51f-4203-a8cc-6b592a9735f7/check-endpoints/0.log"
Apr 16 21:00:18.299942 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:18.299905 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zdf2s_ade2626d-4dc5-4796-9c91-0c0699095807/network-check-target-container/0.log"
Apr 16 21:00:19.174984 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:19.174953 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f69jd_933ff827-8c81-4476-a08c-6f416ce84bd6/iptables-alerter/0.log"
Apr 16 21:00:19.831319 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:19.831291 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2qpdd_ebce4803-f39d-4f96-8ee2-9b2eab78da74/tuned/0.log"
Apr 16 21:00:21.395727 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:21.395693 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-d2sbf_7f8bccf8-1c1e-4890-98f3-747f76421e6c/cluster-samples-operator/0.log"
Apr 16 21:00:21.410949 ip-10-0-129-41 kubenswrapper[2571]: I0416 21:00:21.410913 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-d2sbf_7f8bccf8-1c1e-4890-98f3-747f76421e6c/cluster-samples-operator-watch/0.log"