Apr 17 17:24:32.965915 ip-10-0-130-17 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:24:33.453462 ip-10-0-130-17 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:33.453462 ip-10-0-130-17 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:24:33.453462 ip-10-0-130-17 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:33.453462 ip-10-0-130-17 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:24:33.453462 ip-10-0-130-17 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:33.455515 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.455420 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:24:33.459937 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459912 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:33.459937 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459932 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:33.459937 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459936 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:33.459937 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459941 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:33.459937 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459944 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459947 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459952 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459956 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459960 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459963 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459966 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459970 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459972 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459975 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459978 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459981 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459984 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459986 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459989 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459992 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.459994 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460005 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460008 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:33.460141 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460011 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460013 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460016 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460019 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460022 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460025 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460027 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460030 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460039 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460042 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460044 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460047 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460051 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460055 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460058 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460062 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460066 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460069 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460072 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460075 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:33.460612 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460078 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460080 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460083 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460086 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460088 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460091 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460093 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460096 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460098 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460100 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460103 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460106 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460109 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460111 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460114 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460116 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460119 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460121 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460124 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460127 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:33.461101 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460131 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460133 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460136 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460139 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460142 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460144 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460147 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460150 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460153 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460155 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460158 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460161 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460163 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460166 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460168 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460171 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460174 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460177 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460179 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460182 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:33.461632 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460184 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460187 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460190 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460609 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460616 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460619 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460622 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460625 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460628 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460632 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460635 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460637 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460640 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460643 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460645 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460648 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460651 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460653 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460656 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460660 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:33.462111 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460662 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460665 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460668 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460670 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460674 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460677 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460680 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460683 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460686 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460689 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460691 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460694 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460696 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460699 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460702 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460704 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460707 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460709 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460712 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460715 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:33.462629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460717 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460720 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460723 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460726 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460728 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460731 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460734 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460736 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460739 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460742 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460744 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460747 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460751 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460753 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460756 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460758 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460761 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460763 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460765 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460768 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:33.463157 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460775 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460778 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460781 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460783 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460786 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460788 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460791 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460794 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460796 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460799 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460802 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460804 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460807 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460809 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460812 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460815 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460817 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460820 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460822 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:33.463661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460827 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460831 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460834 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460836 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460840 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460843 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460845 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460848 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460850 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.460853 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461647 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461658 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461665 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461669 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461675 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461679 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461684 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461688 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461691 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461694 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461698 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:24:33.464138 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461701 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461704 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461707 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461710 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461713 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461716 2579 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461719 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461722 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461727 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461730 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461733 2579 flags.go:64] FLAG: --config-dir=""
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461736 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461739 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461744 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461747 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461750 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461754 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461757 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461760 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461763 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461766 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461783 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461789 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461793 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461796 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:24:33.464652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461799 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461803 2579 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461806 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461810 2579 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461814 2579 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461817 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461820 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461823 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461832 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461835 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461837 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461840 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461843 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461846 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461849 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461852 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461855 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461858 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461861 2579 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461865 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]:
I0417 17:24:33.461869 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461872 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461875 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461878 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461881 2579 flags.go:64] FLAG: --help="false" Apr 17 17:24:33.465251 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461884 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461887 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461890 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461893 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461896 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461899 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461902 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461905 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461908 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 
17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461911 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461913 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461917 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461919 2579 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461922 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461925 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461931 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461933 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461936 2579 flags.go:64] FLAG: --lock-file="" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461939 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461942 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461945 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461950 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461953 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461955 2579 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 17 17:24:33.465869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461958 2579 flags.go:64] FLAG: --logging-format="text" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461961 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461965 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461968 2579 flags.go:64] FLAG: --manifest-url="" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461971 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461975 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461978 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461982 2579 flags.go:64] FLAG: --max-pods="110" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461985 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461988 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461991 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461994 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.461997 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462000 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:24:33.466444 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:24:33.462003 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462011 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462014 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462017 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462020 2579 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462023 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462030 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462033 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462036 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462039 2579 flags.go:64] FLAG: --port="10250" Apr 17 17:24:33.466444 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462043 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462046 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0357f4d3d0e0ca47a" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462049 2579 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462052 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:24:33.467048 
ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462055 2579 flags.go:64] FLAG: --register-node="true" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462058 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462061 2579 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462065 2579 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462068 2579 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462071 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462074 2579 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462078 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462081 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462084 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462087 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462090 2579 flags.go:64] FLAG: --runonce="false" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462093 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462096 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462099 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:24:33.467048 
ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462102 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462105 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462108 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462110 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462114 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462117 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462151 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:24:33.467048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462155 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462158 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462162 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462165 2579 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462168 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462174 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462177 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462180 2579 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462186 2579 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462189 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462191 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462194 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462197 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462200 2579 flags.go:64] FLAG: --v="2" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462204 2579 flags.go:64] FLAG: --version="false" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462208 2579 flags.go:64] FLAG: --vmodule="" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462212 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.462216 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462339 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462342 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462346 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462349 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 
17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462352 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462355 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:33.467704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462358 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462361 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462364 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462366 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462369 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462372 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462374 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462376 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462379 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462381 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462386 2579 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462389 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462391 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462393 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462397 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462399 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462402 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462404 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462407 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:33.468298 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462409 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462412 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462415 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462418 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462422 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462425 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462427 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462430 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462432 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462435 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462437 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462440 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462443 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462445 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462447 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462450 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 
17:24:33.462452 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462455 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462458 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462460 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:33.468799 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462463 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462465 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462467 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462472 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462475 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462478 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462480 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462483 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462486 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:33.469376 ip-10-0-130-17 
kubenswrapper[2579]: W0417 17:24:33.462488 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462491 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462494 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462497 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462500 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462502 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462505 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462508 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462510 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462513 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:33.469376 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462515 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462517 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462520 2579 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462523 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462525 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462527 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462530 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462532 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462535 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462537 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462540 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462542 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462545 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462547 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462550 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:33.469879 ip-10-0-130-17 
kubenswrapper[2579]: W0417 17:24:33.462552 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462557 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462559 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462562 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462565 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:33.469879 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462568 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:33.470374 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.462570 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:33.470374 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.463666 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:33.470922 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.470894 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:24:33.470922 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.470922 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:24:33.470995 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.470987 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:33.470995 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.470993 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:33.470995 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.470997 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471000 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471004 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471007 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471010 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471013 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471015 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471018 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471021 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471023 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471026 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471029 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471031 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471034 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471037 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471039 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471042 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471045 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471048 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471051 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:33.471077 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471053 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471056 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471058 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471061 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471064 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471068 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471073 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471077 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471080 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471085 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471088 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471091 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471093 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471096 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471099 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471101 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471104 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471107 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471109 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:33.471580 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471112 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471114 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471117 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471119 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471122 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471125 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471127 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471130 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471133 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471136 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471140 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471143 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471145 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471148 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471151 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471153 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471156 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471159 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471162 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:33.472184 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471166 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471169 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471171 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471175 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471178 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471181 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471183 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471186 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471188 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471191 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471193 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471196 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471198 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471201 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471203 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471206 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471208 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471211 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471213 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471216 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:33.472686 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471218 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471221 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471223 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471226 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471229 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471231 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.471238 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471360 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471365 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471369 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471372 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471375 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471378 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471381 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471383 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:33.473198 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471386 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471389 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471392 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471395 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471398 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471400 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471403 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471406 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471409 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471411 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471415 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471419 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471422 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471424 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471427 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471429 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471432 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471435 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471437 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:33.473629 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471440 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471443 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471447 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471449 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471452 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471455 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471457 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471460 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471462 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471466 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471470 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471473 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471476 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471479 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471482 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471485 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471493 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471496 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471499 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471502 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:33.474107 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471505 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471508 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471511 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471513 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471516 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471519 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471521 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471524 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471526 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471529 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471531 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471534 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471536 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471539 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471542 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471544 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471547 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471550 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471552 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471555 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:33.474667 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471557 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471560 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471562 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471565 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471567 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471570 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471573 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471575 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471578 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471609 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471615 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471618 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471620 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471623 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471626 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471629 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471631 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471634 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:33.475169 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:33.471637 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:33.475627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.471642 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:24:33.475627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.472384 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:24:33.475627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.474522 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:24:33.475719 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.475659 2579 server.go:1019] "Starting client certificate rotation"
Apr 17 17:24:33.475781 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.475758 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:24:33.476436 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.476423 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:24:33.505602 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.505569 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:24:33.511495 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.511465 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:24:33.523624 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.523604 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:24:33.533217 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.533194 2579 log.go:25] "Validated CRI v1 image API"
Apr 17 17:24:33.534345 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.534325 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:24:33.537392 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.537369 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:24:33.538823 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.538797 2579 fs.go:135] Filesystem UUIDs: map[6354b16a-25b3-457c-89ee-965e96fb0e3a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9908dd2b-3d86-4af7-93fd-009ee88c6264:/dev/nvme0n1p3]
Apr 17 17:24:33.538866 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.538824 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:24:33.544754 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.544643 2579 manager.go:217] Machine: {Timestamp:2026-04-17 17:24:33.543214325 +0000 UTC m=+0.449947120 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097891 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20cc73c8eaf48de50ae9e0e7d14a79 SystemUUID:ec20cc73-c8ea-f48d-e50a-e9e0e7d14a79 BootID:5ad4e3ac-7e5c-4079-9bb0-284a3a6366da Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:11:de:7d:95:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:11:de:7d:95:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:a4:50:60:12:8d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:24:33.544754 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.544749 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:24:33.544886 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.544874 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:24:33.545758 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.545731 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:24:33.545908 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.545760 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-17.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:24:33.545952 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.545915 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:24:33.545952 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.545923 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:24:33.545952 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.545936 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:24:33.546688 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.546677 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:24:33.548131 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.548121 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:24:33.548244 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.548235 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:24:33.550881 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.550871 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:24:33.550956 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.550886 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:24:33.550956 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.550898 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:24:33.550956 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.550908 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:24:33.550956 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.550916 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 17:24:33.551852 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.551838 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:24:33.551895 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.551859 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:24:33.555232 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.555204 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 17:24:33.556512 ip-10-0-130-17
kubenswrapper[2579]: I0417 17:24:33.556498 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:24:33.558436 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558424 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558441 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558448 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558453 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558459 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558465 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558471 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:24:33.558480 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558476 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:24:33.558683 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558484 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:24:33.558683 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558490 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:24:33.558683 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558499 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:24:33.558683 
ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.558509 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:24:33.559568 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.559559 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:24:33.559615 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.559568 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:24:33.563636 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.563618 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-17.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:24:33.563748 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.563632 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:24:33.563748 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.563732 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:24:33.563748 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.563735 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:24:33.563882 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.563774 2579 server.go:1295] "Started kubelet" Apr 17 17:24:33.563932 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.563880 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" 
qps=100 burstTokens=10 Apr 17 17:24:33.563987 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.563973 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:24:33.564048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.563925 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:24:33.564730 ip-10-0-130-17 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:24:33.565164 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.564945 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:24:33.565164 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.565076 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:24:33.570407 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.569277 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-17.ec2.internal.18a734d66c30f146 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-17.ec2.internal,UID:ip-10-0-130-17.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-17.ec2.internal,},FirstTimestamp:2026-04-17 17:24:33.563742534 +0000 UTC m=+0.470475328,LastTimestamp:2026-04-17 17:24:33.563742534 +0000 UTC m=+0.470475328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-17.ec2.internal,}" Apr 17 17:24:33.570513 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.570481 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:33.571061 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:24:33.571041 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:24:33.571718 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.571696 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:24:33.571718 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.571714 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:24:33.571886 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.571717 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:24:33.571886 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.571869 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:24:33.571886 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.571877 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:24:33.572223 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.572205 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:33.576407 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.576389 2579 factory.go:153] Registering CRI-O factory Apr 17 17:24:33.576502 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.576452 2579 factory.go:223] Registration of the crio container factory successfully Apr 17 17:24:33.576568 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.576515 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:24:33.576568 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.576524 2579 factory.go:55] Registering systemd factory Apr 17 17:24:33.576568 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.576533 2579 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:24:33.576568 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:24:33.576554 2579 factory.go:103] Registering Raw factory Apr 17 17:24:33.576568 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.576568 2579 manager.go:1196] Started watching for new ooms in manager Apr 17 17:24:33.577026 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.577009 2579 manager.go:319] Starting recovery of all containers Apr 17 17:24:33.577568 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.577520 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:24:33.577708 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.577606 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:24:33.578238 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.578206 2579 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:24:33.587151 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.587136 2579 manager.go:324] Recovery completed Apr 17 17:24:33.587935 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.587913 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8ltq" Apr 17 17:24:33.594033 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.594018 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:33.596439 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.596420 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:33.596517 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.596451 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:33.596517 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.596462 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:33.596919 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.596905 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:24:33.596919 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.596917 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:24:33.596981 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.596935 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:33.598892 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.598830 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-130-17.ec2.internal.18a734d66e23d624 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-17.ec2.internal,UID:ip-10-0-130-17.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-17.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-17.ec2.internal,},FirstTimestamp:2026-04-17 17:24:33.596438052 +0000 UTC m=+0.503170845,LastTimestamp:2026-04-17 17:24:33.596438052 +0000 UTC m=+0.503170845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-17.ec2.internal,}" Apr 17 17:24:33.599190 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.599168 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8ltq" Apr 17 17:24:33.600664 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.600651 2579 policy_none.go:49] "None policy: Start" Apr 17 17:24:33.600710 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.600668 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:24:33.600710 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.600679 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.634229 2579 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635230 2579 manager.go:341] "Starting Device Plugin manager" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.635344 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635358 2579 server.go:85] "Starting device plugin registration server" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635551 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635580 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635618 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635627 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635638 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635629 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635799 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.635805 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635871 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.635880 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.636444 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.636486 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:33.649653 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.639843 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:33.736368 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.736283 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:33.736479 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.736282 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal"] Apr 17 17:24:33.736523 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.736483 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:33.741775 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741754 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:33.741902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741787 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:33.741902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741802 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:33.741902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741756 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientMemory" Apr 17 
17:24:33.741902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741868 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:33.741902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741889 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:33.742128 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.741915 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.744179 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744166 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:33.744296 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744283 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.744343 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744315 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:33.744930 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744908 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:33.745017 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744971 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:33.745017 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744984 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:33.745017 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.744908 2579 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:33.745117 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.745022 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:33.745117 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.745035 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:33.747071 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.747056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.747145 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.747082 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:33.747793 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.747778 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:33.747850 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.747805 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:33.747850 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.747815 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:33.750826 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.750811 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.750919 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.750835 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-17.ec2.internal\": node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 
17:24:33.766527 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.766502 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:33.770240 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.770224 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-17.ec2.internal\" not found" node="ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.774642 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.774625 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-17.ec2.internal\" not found" node="ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.866810 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.866780 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:33.873096 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.873077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f3374a8f3007e3f35c70fe0710807983-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal\" (UID: \"f3374a8f3007e3f35c70fe0710807983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.873172 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.873102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3374a8f3007e3f35c70fe0710807983-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal\" (UID: \"f3374a8f3007e3f35c70fe0710807983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.873172 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.873121 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ee539be09c69658279b69c4b8f0acb61-config\") pod \"kube-apiserver-proxy-ip-10-0-130-17.ec2.internal\" (UID: \"ee539be09c69658279b69c4b8f0acb61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.967266 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:33.967221 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:33.973600 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.973569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f3374a8f3007e3f35c70fe0710807983-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal\" (UID: \"f3374a8f3007e3f35c70fe0710807983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.973680 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.973616 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3374a8f3007e3f35c70fe0710807983-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal\" (UID: \"f3374a8f3007e3f35c70fe0710807983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.973680 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.973635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ee539be09c69658279b69c4b8f0acb61-config\") pod \"kube-apiserver-proxy-ip-10-0-130-17.ec2.internal\" (UID: \"ee539be09c69658279b69c4b8f0acb61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.973783 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.973707 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f3374a8f3007e3f35c70fe0710807983-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal\" (UID: \"f3374a8f3007e3f35c70fe0710807983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.973783 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.973698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3374a8f3007e3f35c70fe0710807983-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal\" (UID: \"f3374a8f3007e3f35c70fe0710807983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:33.973783 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:33.973714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ee539be09c69658279b69c4b8f0acb61-config\") pod \"kube-apiserver-proxy-ip-10-0-130-17.ec2.internal\" (UID: \"ee539be09c69658279b69c4b8f0acb61\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" Apr 17 17:24:34.068117 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.068041 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.072219 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.072205 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:34.076890 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.076863 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" Apr 17 17:24:34.168372 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.168327 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.268960 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.268916 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.369464 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.369391 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.463536 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.463510 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:34.470425 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.470402 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.475819 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.475802 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:24:34.475945 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.475928 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:24:34.476003 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.475927 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch 
close - watch lasted less than a second and no items received" Apr 17 17:24:34.571049 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.571020 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:34.571222 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.571022 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.582398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.582368 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:34.601997 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.601956 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:19:33 +0000 UTC" deadline="2027-11-01 03:34:51.930932637 +0000 UTC" Apr 17 17:24:34.601997 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.601991 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13498h10m17.328945188s" Apr 17 17:24:34.604772 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.604750 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fw4sp" Apr 17 17:24:34.613205 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.613180 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fw4sp" Apr 17 17:24:34.617363 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:34.617329 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee539be09c69658279b69c4b8f0acb61.slice/crio-2027d3115374ae2dc0f30ddea70b2300cdb2205e0f353479020c8e4b29c72513 
WatchSource:0}: Error finding container 2027d3115374ae2dc0f30ddea70b2300cdb2205e0f353479020c8e4b29c72513: Status 404 returned error can't find the container with id 2027d3115374ae2dc0f30ddea70b2300cdb2205e0f353479020c8e4b29c72513 Apr 17 17:24:34.617762 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:34.617740 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3374a8f3007e3f35c70fe0710807983.slice/crio-0b84d7167167a8b80da2afe49ebc641bcae87a2051c317f607085ba03d0255f4 WatchSource:0}: Error finding container 0b84d7167167a8b80da2afe49ebc641bcae87a2051c317f607085ba03d0255f4: Status 404 returned error can't find the container with id 0b84d7167167a8b80da2afe49ebc641bcae87a2051c317f607085ba03d0255f4 Apr 17 17:24:34.622311 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.622297 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:24:34.639424 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.639381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" event={"ID":"f3374a8f3007e3f35c70fe0710807983","Type":"ContainerStarted","Data":"0b84d7167167a8b80da2afe49ebc641bcae87a2051c317f607085ba03d0255f4"} Apr 17 17:24:34.640295 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.640271 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" event={"ID":"ee539be09c69658279b69c4b8f0acb61","Type":"ContainerStarted","Data":"2027d3115374ae2dc0f30ddea70b2300cdb2205e0f353479020c8e4b29c72513"} Apr 17 17:24:34.658239 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.658217 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:34.671512 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.671482 2579 kubelet_node_status.go:515] 
"Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.771953 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.771915 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.872372 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:34.872284 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-17.ec2.internal\" not found" Apr 17 17:24:34.969987 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.969955 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:34.971613 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.971583 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" Apr 17 17:24:34.983390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.983365 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:34.984389 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.984376 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" Apr 17 17:24:34.992415 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:34.992397 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:24:35.551486 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.551398 2579 apiserver.go:52] "Watching apiserver" Apr 17 17:24:35.560635 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.560604 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 
17:24:35.560931 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.560906 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6","openshift-cluster-node-tuning-operator/tuned-jscv2","openshift-network-diagnostics/network-check-target-m8x5g","openshift-image-registry/node-ca-9szwb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal","openshift-multus/multus-additional-cni-plugins-z8qx6","openshift-multus/multus-njgk7","openshift-multus/network-metrics-daemon-z942n","openshift-network-operator/iptables-alerter-bbk45","openshift-ovn-kubernetes/ovnkube-node-h29v4","kube-system/konnectivity-agent-bzxhn"] Apr 17 17:24:35.563393 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.563361 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.565626 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.565581 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.565753 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.565698 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:24:35.565819 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.565802 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:24:35.565902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.565886 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-b6dcv\"" Apr 17 17:24:35.565949 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.565911 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:24:35.567472 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.567451 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:35.568127 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.568014 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5x4pc\"" Apr 17 17:24:35.568127 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.568045 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:35.570482 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.570444 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.572495 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572415 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:24:35.572614 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572536 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:24:35.572684 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572663 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:24:35.572747 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572731 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6t24b\"" Apr 17 17:24:35.572800 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572761 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:24:35.572800 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572763 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.572891 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.572877 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:24:35.574702 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.574682 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:24:35.574796 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.574762 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-v4g62\"" Apr 17 17:24:35.574966 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.574941 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:35.575035 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.575027 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bbk45" Apr 17 17:24:35.575263 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.575018 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:35.576782 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.576763 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:35.577047 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.577025 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sp75x\"" Apr 17 17:24:35.577124 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.577045 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:24:35.577304 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.577282 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bzxhn" Apr 17 17:24:35.577944 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.577926 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:24:35.579265 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.579247 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:24:35.579416 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.579401 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-f9gp2\"" Apr 17 17:24:35.579487 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.579473 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:24:35.579641 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.579626 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:35.579713 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.579691 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:35.580519 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-etc-kubernetes\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.580614 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-registration-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.580614 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580567 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xhq\" (UniqueName: \"kubernetes.io/projected/54d743cb-8308-489d-817b-2a68d49ddfa1-kube-api-access-85xhq\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.580727 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580614 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-lib-modules\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.580727 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ced94cc1-575e-4efc-8406-8add5b3da29c-tmp\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.580727 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-system-cni-dir\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.580727 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580687 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-hostroot\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.580727 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580710 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-conf-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581001 
ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-systemd\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580877 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-etc-selinux\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580903 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-sys-fs\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-sys\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.580969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-tuned\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581007 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:24:35.581064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-socket-dir-parent\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-kubelet\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581126 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-host\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581153 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-cnibin\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: 
\"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581211 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-cni-binary-copy\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-cni-bin\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581260 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsvx7\" (UniqueName: \"kubernetes.io/projected/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-kube-api-access-wsvx7\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5bl\" (UniqueName: \"kubernetes.io/projected/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-kube-api-access-6b5bl\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-device-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.581381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581358 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysconfig\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysctl-d\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581422 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-var-lib-kubelet\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581467 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: 
I0417 17:24:35.581492 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-system-cni-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-cni-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-run\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zxp8\" (UniqueName: \"kubernetes.io/projected/ced94cc1-575e-4efc-8406-8add5b3da29c-kube-api-access-9zxp8\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581582 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrl8b\" (UniqueName: \"kubernetes.io/projected/28463658-293e-4847-bb58-c40452c9ceba-kube-api-access-mrl8b\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " 
pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-cnibin\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581664 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-os-release\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-cni-multus\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-multus-certs\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-socket-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-modprobe-d\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.581862 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581843 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-kubernetes\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.582451 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysctl-conf\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.582451 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-os-release\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.582451 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-k8s-cni-cncf-io\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.582451 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.581974 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-netns\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.582451 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.582022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-daemon-config\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.582451 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.582217 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.584832 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.584713 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 17:24:35.584832 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.584745 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 17:24:35.585151 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.585128 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 17:24:35.586365 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.585417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.586365 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.585707 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vlcl6\""
Apr 17 17:24:35.588902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.588469 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 17:24:35.588985 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.588948 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 17:24:35.589792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.589317 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:24:35.589792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.589531 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:24:35.589792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.589762 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zqhdk\""
Apr 17 17:24:35.590396 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.589978 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:24:35.591255 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.591238 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 17:24:35.613922 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.613855 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:34 +0000 UTC" deadline="2027-12-28 07:34:03.774299338 +0000 UTC"
Apr 17 17:24:35.613922 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.613886 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14870h9m28.160417318s"
Apr 17 17:24:35.673298 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.673267 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 17:24:35.682656 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsn2\" (UniqueName: \"kubernetes.io/projected/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-kube-api-access-xbsn2\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.682816 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-cni-binary-copy\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.682816 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682684 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-cni-bin\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.682816 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:24:35.682816 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682758 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-systemd\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.682816 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-cni-bin\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683014 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682865 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a926d37e-a35b-4d6d-a341-5f224db6cd94-agent-certs\") pod \"konnectivity-agent-bzxhn\" (UID: \"a926d37e-a35b-4d6d-a341-5f224db6cd94\") " pod="kube-system/konnectivity-agent-bzxhn"
Apr 17 17:24:35.683014 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-iptables-alerter-script\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.683014 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-run\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.683014 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zxp8\" (UniqueName: \"kubernetes.io/projected/ced94cc1-575e-4efc-8406-8add5b3da29c-kube-api-access-9zxp8\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.683014 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.682991 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrl8b\" (UniqueName: \"kubernetes.io/projected/28463658-293e-4847-bb58-c40452c9ceba-kube-api-access-mrl8b\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683030 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-cni-multus\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-ovn\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683068 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d18b6520-db25-43a0-bca5-6990fef41e34-serviceca\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-kubernetes\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-k8s-cni-cncf-io\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-cni-multus\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683158 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-netns\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-netns\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-k8s-cni-cncf-io\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-etc-kubernetes\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683247 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-kubernetes\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683301 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovnkube-config\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683290 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-etc-kubernetes\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgsxj\" (UniqueName: \"kubernetes.io/projected/d18b6520-db25-43a0-bca5-6990fef41e34-kube-api-access-mgsxj\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-cni-binary-copy\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683358 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-run\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683363 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-host-slash\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683420 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85xhq\" (UniqueName: \"kubernetes.io/projected/54d743cb-8308-489d-817b-2a68d49ddfa1-kube-api-access-85xhq\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-system-cni-dir\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683456 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683482 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-log-socket\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-env-overrides\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683569 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d18b6520-db25-43a0-bca5-6990fef41e34-host\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.683714 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-etc-selinux\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683678 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-sys-fs\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-sys\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.683716 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-kubelet\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsvx7\" (UniqueName: \"kubernetes.io/projected/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-kube-api-access-wsvx7\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5bl\" (UniqueName: \"kubernetes.io/projected/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-kube-api-access-6b5bl\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-host\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-cnibin\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.683959 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovnkube-script-lib\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684053 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a926d37e-a35b-4d6d-a341-5f224db6cd94-konnectivity-ca\") pod \"konnectivity-agent-bzxhn\" (UID: \"a926d37e-a35b-4d6d-a341-5f224db6cd94\") " pod="kube-system/konnectivity-agent-bzxhn"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684063 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-device-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysconfig\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-var-lib-kubelet\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.684453 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684148 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysctl-d\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-var-lib-kubelet\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-system-cni-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-cni-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-cnibin\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-os-release\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-multus-certs\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-cni-netd\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.684369 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:36.184332618 +0000 UTC m=+3.091065406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-system-cni-dir\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-socket-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684504 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-cnibin\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-etc-selinux\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684563 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-cni-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.685263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-modprobe-d\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684670 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-os-release\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysctl-conf\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-host\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684723 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-sys-fs\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684767 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-system-cni-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684771 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysconfig\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684775 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-modprobe-d\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684820 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\"
(UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-var-lib-kubelet\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684828 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-host-run-multus-certs\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysctl-d\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684854 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-cnibin\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-socket-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-device-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684888 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-sys\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-sysctl-conf\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/28463658-293e-4847-bb58-c40452c9ceba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.684984 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-os-release\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.686023 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685040 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-daemon-config\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685104 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/28463658-293e-4847-bb58-c40452c9ceba-os-release\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685153 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-kubelet\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-node-log\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: 
I0417 17:24:35.685282 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovn-node-metrics-cert\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685364 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-registration-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685391 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-lib-modules\") pod 
\"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685414 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ced94cc1-575e-4efc-8406-8add5b3da29c-tmp\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-hostroot\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685462 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-conf-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685512 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-conf-dir\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685513 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-lib-modules\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-hostroot\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685562 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54d743cb-8308-489d-817b-2a68d49ddfa1-registration-dir\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.686654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-run-netns\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-daemon-config\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-etc-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: 
\"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-systemd\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-systemd-units\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-var-lib-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685738 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-tuned\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-socket-dir-parent\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-slash\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685824 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-cni-bin\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685844 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvtmx\" (UniqueName: 
\"kubernetes.io/projected/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-kube-api-access-fvtmx\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-multus-socket-dir-parent\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.687223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.685742 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-systemd\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.688516 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.688356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ced94cc1-575e-4efc-8406-8add5b3da29c-tmp\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.688639 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.688399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ced94cc1-575e-4efc-8406-8add5b3da29c-etc-tuned\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.697131 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.697109 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zxp8\" (UniqueName: 
\"kubernetes.io/projected/ced94cc1-575e-4efc-8406-8add5b3da29c-kube-api-access-9zxp8\") pod \"tuned-jscv2\" (UID: \"ced94cc1-575e-4efc-8406-8add5b3da29c\") " pod="openshift-cluster-node-tuning-operator/tuned-jscv2" Apr 17 17:24:35.703730 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.700015 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsvx7\" (UniqueName: \"kubernetes.io/projected/5f87164f-e1cb-4cad-aeef-d75c4e3648c7-kube-api-access-wsvx7\") pod \"multus-njgk7\" (UID: \"5f87164f-e1cb-4cad-aeef-d75c4e3648c7\") " pod="openshift-multus/multus-njgk7" Apr 17 17:24:35.703730 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.700954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xhq\" (UniqueName: \"kubernetes.io/projected/54d743cb-8308-489d-817b-2a68d49ddfa1-kube-api-access-85xhq\") pod \"aws-ebs-csi-driver-node-swxn6\" (UID: \"54d743cb-8308-489d-817b-2a68d49ddfa1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" Apr 17 17:24:35.704129 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.704105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrl8b\" (UniqueName: \"kubernetes.io/projected/28463658-293e-4847-bb58-c40452c9ceba-kube-api-access-mrl8b\") pod \"multus-additional-cni-plugins-z8qx6\" (UID: \"28463658-293e-4847-bb58-c40452c9ceba\") " pod="openshift-multus/multus-additional-cni-plugins-z8qx6" Apr 17 17:24:35.705082 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.705056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5bl\" (UniqueName: \"kubernetes.io/projected/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-kube-api-access-6b5bl\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:35.786135 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786104 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovnkube-script-lib\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786301 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a926d37e-a35b-4d6d-a341-5f224db6cd94-konnectivity-ca\") pod \"konnectivity-agent-bzxhn\" (UID: \"a926d37e-a35b-4d6d-a341-5f224db6cd94\") " pod="kube-system/konnectivity-agent-bzxhn" Apr 17 17:24:35.786301 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-cni-netd\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786301 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-kubelet\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786301 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786240 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-node-log\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786301 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786264 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786301 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-cni-netd\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-kubelet\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786351 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-node-log\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:24:35.786385 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovn-node-metrics-cert\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-run-netns\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786428 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-etc-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-systemd-units\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-var-lib-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786486 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-slash\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-run-netns\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786507 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-cni-bin\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-systemd-units\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fvtmx\" (UniqueName: \"kubernetes.io/projected/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-kube-api-access-fvtmx\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786582 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsn2\" (UniqueName: \"kubernetes.io/projected/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-kube-api-access-xbsn2\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.786622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786581 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-etc-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786663 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-systemd\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a926d37e-a35b-4d6d-a341-5f224db6cd94-agent-certs\") pod \"konnectivity-agent-bzxhn\" (UID: \"a926d37e-a35b-4d6d-a341-5f224db6cd94\") " pod="kube-system/konnectivity-agent-bzxhn"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786704 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a926d37e-a35b-4d6d-a341-5f224db6cd94-konnectivity-ca\") pod \"konnectivity-agent-bzxhn\" (UID: \"a926d37e-a35b-4d6d-a341-5f224db6cd94\") " pod="kube-system/konnectivity-agent-bzxhn"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786715 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-iptables-alerter-script\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovnkube-script-lib\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-ovn\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d18b6520-db25-43a0-bca5-6990fef41e34-serviceca\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786791 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-systemd\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786820 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786846 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovnkube-config\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgsxj\" (UniqueName: \"kubernetes.io/projected/d18b6520-db25-43a0-bca5-6990fef41e34-kube-api-access-mgsxj\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786893 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-host-slash\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786931 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-log-socket\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-var-lib-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-env-overrides\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787390 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.786978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d18b6520-db25-43a0-bca5-6990fef41e34-host\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787011 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-slash\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d18b6520-db25-43a0-bca5-6990fef41e34-host\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787183 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-host-cni-bin\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-iptables-alerter-script\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-ovn\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787245 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-host-slash\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787271 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-log-socket\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-run-openvswitch\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d18b6520-db25-43a0-bca5-6990fef41e34-serviceca\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovnkube-config\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.787980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.787618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-env-overrides\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.789007 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.788978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-ovn-node-metrics-cert\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.789333 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.789316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a926d37e-a35b-4d6d-a341-5f224db6cd94-agent-certs\") pod \"konnectivity-agent-bzxhn\" (UID: \"a926d37e-a35b-4d6d-a341-5f224db6cd94\") " pod="kube-system/konnectivity-agent-bzxhn"
Apr 17 17:24:35.792846 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.792827 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:35.792939 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.792850 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:35.792939 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.792864 2579 projected.go:194] Error preparing data for projected volume kube-api-access-q4tws for pod openshift-network-diagnostics/network-check-target-m8x5g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:35.793027 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:35.792938 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws podName:0ace93ad-4902-4616-82aa-f2d931df41ef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:36.292918899 +0000 UTC m=+3.199651696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-q4tws" (UniqueName: "kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws") pod "network-check-target-m8x5g" (UID: "0ace93ad-4902-4616-82aa-f2d931df41ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:35.795352 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.795332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgsxj\" (UniqueName: \"kubernetes.io/projected/d18b6520-db25-43a0-bca5-6990fef41e34-kube-api-access-mgsxj\") pod \"node-ca-9szwb\" (UID: \"d18b6520-db25-43a0-bca5-6990fef41e34\") " pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.795460 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.795375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvtmx\" (UniqueName: \"kubernetes.io/projected/5ec8213b-d815-438e-ab3d-f610b8fc1f8a-kube-api-access-fvtmx\") pod \"ovnkube-node-h29v4\" (UID: \"5ec8213b-d815-438e-ab3d-f610b8fc1f8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:35.795557 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.795540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsn2\" (UniqueName: \"kubernetes.io/projected/8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b-kube-api-access-xbsn2\") pod \"iptables-alerter-bbk45\" (UID: \"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b\") " pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.874690 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.874586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6"
Apr 17 17:24:35.882418 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.882388 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jscv2"
Apr 17 17:24:35.884614 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.884575 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:35.894463 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.894439 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8qx6"
Apr 17 17:24:35.899103 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.899082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-njgk7"
Apr 17 17:24:35.905645 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.905625 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-bbk45"
Apr 17 17:24:35.912216 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.912193 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bzxhn"
Apr 17 17:24:35.918770 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.918749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9szwb"
Apr 17 17:24:35.925372 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:35.925355 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:36.190286 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.190203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:36.190446 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:36.190354 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:36.190446 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:36.190421 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:37.190406038 +0000 UTC m=+4.097138818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:36.307468 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.307433 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda926d37e_a35b_4d6d_a341_5f224db6cd94.slice/crio-9656db9c58b14c30ce6a17a345bbabd0d8756b262efa537b038deee67d4f944e WatchSource:0}: Error finding container 9656db9c58b14c30ce6a17a345bbabd0d8756b262efa537b038deee67d4f944e: Status 404 returned error can't find the container with id 9656db9c58b14c30ce6a17a345bbabd0d8756b262efa537b038deee67d4f944e
Apr 17 17:24:36.310055 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.309903 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec8213b_d815_438e_ab3d_f610b8fc1f8a.slice/crio-9162a01595e7b8d86651e81b6a12019ef6e19740df409ecd822067f7ab0ea7bc WatchSource:0}: Error finding container 9162a01595e7b8d86651e81b6a12019ef6e19740df409ecd822067f7ab0ea7bc: Status 404 returned error can't find the container with id 9162a01595e7b8d86651e81b6a12019ef6e19740df409ecd822067f7ab0ea7bc
Apr 17 17:24:36.310785 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.310721 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced94cc1_575e_4efc_8406_8add5b3da29c.slice/crio-1b35c1c81f5991213ec176d03e6849249754edd5b1c657bb17f268c4dfcf1714 WatchSource:0}: Error finding container 1b35c1c81f5991213ec176d03e6849249754edd5b1c657bb17f268c4dfcf1714: Status 404 returned error can't find the container with id 1b35c1c81f5991213ec176d03e6849249754edd5b1c657bb17f268c4dfcf1714
Apr 17 17:24:36.313981 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.313956 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f87164f_e1cb_4cad_aeef_d75c4e3648c7.slice/crio-17a2dbc865859fa3815f4ca46127b0e035e92a1a0dca63329c5fb9e13b6977be WatchSource:0}: Error finding container 17a2dbc865859fa3815f4ca46127b0e035e92a1a0dca63329c5fb9e13b6977be: Status 404 returned error can't find the container with id 17a2dbc865859fa3815f4ca46127b0e035e92a1a0dca63329c5fb9e13b6977be
Apr 17 17:24:36.314997 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.314971 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18b6520_db25_43a0_bca5_6990fef41e34.slice/crio-d9e8b2ba901dd7e03a4156ce5ee3b2941040fcd904c3fcea7050eaae92eb2008 WatchSource:0}: Error finding container d9e8b2ba901dd7e03a4156ce5ee3b2941040fcd904c3fcea7050eaae92eb2008: Status 404 returned error can't find the container with id d9e8b2ba901dd7e03a4156ce5ee3b2941040fcd904c3fcea7050eaae92eb2008
Apr 17 17:24:36.315661 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.315639 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b5e6ca6_9cb4_41b2_9b42_cece2ca5ad9b.slice/crio-3e2637975a91c20ec1653909d897fcacd3c0541ad7eb30d88de2c6e06aff75e2 WatchSource:0}: Error finding container 3e2637975a91c20ec1653909d897fcacd3c0541ad7eb30d88de2c6e06aff75e2: Status 404 returned error can't find the container with id 3e2637975a91c20ec1653909d897fcacd3c0541ad7eb30d88de2c6e06aff75e2
Apr 17 17:24:36.317094 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:36.316988 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d743cb_8308_489d_817b_2a68d49ddfa1.slice/crio-b12bc1a0eb8493a052c3d9afd2e7e12d57e90fb23d849495bbebafa762a420d7 WatchSource:0}: Error finding container b12bc1a0eb8493a052c3d9afd2e7e12d57e90fb23d849495bbebafa762a420d7: Status 404 returned error can't find the container with id b12bc1a0eb8493a052c3d9afd2e7e12d57e90fb23d849495bbebafa762a420d7
Apr 17 17:24:36.391072 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.391031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:24:36.391198 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:36.391170 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:36.391198 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:36.391190 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:36.391198 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:36.391200 2579 projected.go:194] Error preparing data for projected volume kube-api-access-q4tws for pod openshift-network-diagnostics/network-check-target-m8x5g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:36.391373 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:36.391252 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws podName:0ace93ad-4902-4616-82aa-f2d931df41ef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:37.391233416 +0000 UTC m=+4.297966198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-q4tws" (UniqueName: "kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws") pod "network-check-target-m8x5g" (UID: "0ace93ad-4902-4616-82aa-f2d931df41ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:36.537967 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.537863 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:36.614169 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.614132 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:34 +0000 UTC" deadline="2027-11-08 11:57:33.598786612 +0000 UTC"
Apr 17 17:24:36.614169 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.614163 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13674h32m56.984626976s"
Apr 17 17:24:36.645975 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.645932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" event={"ID":"ee539be09c69658279b69c4b8f0acb61","Type":"ContainerStarted","Data":"802938eaa171797ef9050aa706633843d5411b0d3fadb010794fcd4e61c6ee79"}
Apr 17 17:24:36.647501 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.647473 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" event={"ID":"54d743cb-8308-489d-817b-2a68d49ddfa1","Type":"ContainerStarted","Data":"b12bc1a0eb8493a052c3d9afd2e7e12d57e90fb23d849495bbebafa762a420d7"}
Apr 17 17:24:36.648636 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.648618 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-njgk7" event={"ID":"5f87164f-e1cb-4cad-aeef-d75c4e3648c7","Type":"ContainerStarted","Data":"17a2dbc865859fa3815f4ca46127b0e035e92a1a0dca63329c5fb9e13b6977be"}
Apr 17 17:24:36.650660 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.650267 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"9162a01595e7b8d86651e81b6a12019ef6e19740df409ecd822067f7ab0ea7bc"}
Apr 17 17:24:36.653628 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.652656 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bbk45" event={"ID":"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b","Type":"ContainerStarted","Data":"3e2637975a91c20ec1653909d897fcacd3c0541ad7eb30d88de2c6e06aff75e2"}
Apr 17 17:24:36.654427 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.654369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9szwb" event={"ID":"d18b6520-db25-43a0-bca5-6990fef41e34","Type":"ContainerStarted","Data":"d9e8b2ba901dd7e03a4156ce5ee3b2941040fcd904c3fcea7050eaae92eb2008"}
Apr 17 17:24:36.655664 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.655630 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jscv2" event={"ID":"ced94cc1-575e-4efc-8406-8add5b3da29c","Type":"ContainerStarted","Data":"1b35c1c81f5991213ec176d03e6849249754edd5b1c657bb17f268c4dfcf1714"}
Apr 17 17:24:36.657961 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.657932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerStarted","Data":"5ae56ca7a30d6869d8fe305fbeccda964fdb902647a275559f46e71876524577"}
Apr 17 17:24:36.661178 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:36.661154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bzxhn" event={"ID":"a926d37e-a35b-4d6d-a341-5f224db6cd94","Type":"ContainerStarted","Data":"9656db9c58b14c30ce6a17a345bbabd0d8756b262efa537b038deee67d4f944e"}
Apr 17 17:24:37.201173 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.200569 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:37.201173 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.200783 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:37.201173 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.200842 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:39.200822849 +0000 UTC m=+6.107555637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:37.402966 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.402221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:24:37.402966 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.402435 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:37.402966 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.402462 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:37.402966 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.402476 2579 projected.go:194] Error preparing data for projected volume kube-api-access-q4tws for pod openshift-network-diagnostics/network-check-target-m8x5g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:37.402966 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.402537 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws podName:0ace93ad-4902-4616-82aa-f2d931df41ef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:39.402518167 +0000 UTC m=+6.309250964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-q4tws" (UniqueName: "kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws") pod "network-check-target-m8x5g" (UID: "0ace93ad-4902-4616-82aa-f2d931df41ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:37.639381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.638625 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:24:37.639381 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.638743 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef"
Apr 17 17:24:37.639381 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.639159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:37.639381 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:37.639256 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb"
Apr 17 17:24:37.672427 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.672344 2579 generic.go:358] "Generic (PLEG): container finished" podID="f3374a8f3007e3f35c70fe0710807983" containerID="ecdb323414c2f37633738c93c9dcaeb060d148f81c2e6573f081dc4d896119de" exitCode=0
Apr 17 17:24:37.673348 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.673320 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" event={"ID":"f3374a8f3007e3f35c70fe0710807983","Type":"ContainerDied","Data":"ecdb323414c2f37633738c93c9dcaeb060d148f81c2e6573f081dc4d896119de"}
Apr 17 17:24:37.688230 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:37.688177 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-17.ec2.internal" podStartSLOduration=3.688160712 podStartE2EDuration="3.688160712s" podCreationTimestamp="2026-04-17 17:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:36.664291667 +0000 UTC m=+3.571024472" watchObservedRunningTime="2026-04-17 17:24:37.688160712 +0000 UTC m=+4.594893516"
Apr 17 17:24:38.680397 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:38.679680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" event={"ID":"f3374a8f3007e3f35c70fe0710807983","Type":"ContainerStarted","Data":"1cc1883fadef2c9bffacb68e72722887cc5de4dc7f3885c57ec7216c4058fe43"}
Apr 17 17:24:39.215424 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:39.215378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:39.215648 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.215519 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:39.215648 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.215581 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:43.215563319 +0000 UTC m=+10.122296103 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:39.417187 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:39.417145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:24:39.417371 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.417336 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:39.417371 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.417355 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr
17 17:24:39.417371 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.417368 2579 projected.go:194] Error preparing data for projected volume kube-api-access-q4tws for pod openshift-network-diagnostics/network-check-target-m8x5g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:39.417525 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.417430 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws podName:0ace93ad-4902-4616-82aa-f2d931df41ef nodeName:}" failed. No retries permitted until 2026-04-17 17:24:43.417411009 +0000 UTC m=+10.324143811 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-q4tws" (UniqueName: "kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws") pod "network-check-target-m8x5g" (UID: "0ace93ad-4902-4616-82aa-f2d931df41ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:39.637479 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:39.636725 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:39.637479 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.636880 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:39.637479 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:39.637267 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:39.637479 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:39.637362 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:41.636639 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:41.636342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:41.636639 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:41.636471 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:41.636639 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:41.636550 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:41.637194 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:41.636665 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:43.249998 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:43.249956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:43.250446 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.250126 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:43.250446 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.250191 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:24:51.250171198 +0000 UTC m=+18.156903984 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:43.452202 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:43.452154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:43.452376 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.452353 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:43.452376 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.452376 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:43.452521 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.452389 2579 projected.go:194] Error preparing data for projected volume kube-api-access-q4tws for pod openshift-network-diagnostics/network-check-target-m8x5g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:43.452521 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.452452 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws podName:0ace93ad-4902-4616-82aa-f2d931df41ef nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:51.452432621 +0000 UTC m=+18.359165416 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-q4tws" (UniqueName: "kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws") pod "network-check-target-m8x5g" (UID: "0ace93ad-4902-4616-82aa-f2d931df41ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:43.637287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:43.637174 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:43.637450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.637298 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:43.637754 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:43.637586 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:43.637754 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:43.637714 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:45.636086 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:45.636056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:45.636531 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:45.636057 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:45.636531 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:45.636182 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:45.636531 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:45.636290 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:47.639108 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:47.639080 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:47.639560 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:47.639090 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:47.639560 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:47.639192 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:47.639560 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:47.639287 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:49.045312 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.045250 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-17.ec2.internal" podStartSLOduration=15.045230825 podStartE2EDuration="15.045230825s" podCreationTimestamp="2026-04-17 17:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:38.693673604 +0000 UTC m=+5.600406408" watchObservedRunningTime="2026-04-17 17:24:49.045230825 +0000 UTC m=+15.951963636" Apr 17 17:24:49.045997 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.045978 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5j2pv"] Apr 17 17:24:49.050160 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.050140 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.050270 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.050210 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95" Apr 17 17:24:49.091499 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.091462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-dbus\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.091672 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.091510 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-kubelet-config\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.091672 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.091571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.192752 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.192720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-dbus\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.192752 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.192761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-kubelet-config\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.192997 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.192785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.192997 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.192861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-kubelet-config\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.192997 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.192921 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:49.192997 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.192952 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-dbus\") pod \"global-pull-secret-syncer-5j2pv\" (UID: 
\"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.192997 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.192977 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret podName:f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:49.692959074 +0000 UTC m=+16.599691868 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret") pod "global-pull-secret-syncer-5j2pv" (UID: "f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:49.638827 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.638796 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:49.638990 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.638796 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:49.638990 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.638916 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:49.639090 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.639028 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:49.697144 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:49.697114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:49.697341 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.697255 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:49.697341 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:49.697322 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret podName:f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:50.697305261 +0000 UTC m=+17.604038042 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret") pod "global-pull-secret-syncer-5j2pv" (UID: "f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:50.636305 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:50.636269 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:50.636874 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:50.636400 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95" Apr 17 17:24:50.701462 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:50.701419 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:50.701673 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:50.701577 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:50.701673 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:50.701661 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret podName:f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:52.701641503 +0000 UTC m=+19.608374285 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret") pod "global-pull-secret-syncer-5j2pv" (UID: "f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:51.306030 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:51.305988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:51.306215 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.306162 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:51.306263 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.306238 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.306220953 +0000 UTC m=+34.212953735 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:51.507782 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:51.507735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:51.507951 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.507931 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:51.508018 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.507960 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:51.508018 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.507974 2579 projected.go:194] Error preparing data for projected volume kube-api-access-q4tws for pod openshift-network-diagnostics/network-check-target-m8x5g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:51.508115 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.508047 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws podName:0ace93ad-4902-4616-82aa-f2d931df41ef nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:07.50802805 +0000 UTC m=+34.414760854 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-q4tws" (UniqueName: "kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws") pod "network-check-target-m8x5g" (UID: "0ace93ad-4902-4616-82aa-f2d931df41ef") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:51.636564 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:51.636458 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:51.637010 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.636583 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:51.637010 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:51.636663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:51.637010 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:51.636779 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:52.636399 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:52.636366 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:52.636586 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:52.636485 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95" Apr 17 17:24:52.715809 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:52.715779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:52.715965 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:52.715880 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:52.715965 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:52.715928 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret podName:f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:56.715915063 +0000 UTC m=+23.622647857 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret") pod "global-pull-secret-syncer-5j2pv" (UID: "f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:53.636874 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.636678 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:53.637685 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.636729 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:53.637685 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:53.636960 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:53.637685 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:53.637490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:53.706486 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.706302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9szwb" event={"ID":"d18b6520-db25-43a0-bca5-6990fef41e34","Type":"ContainerStarted","Data":"5128fd41cbd883ead90deac48178a20a23cacde9a8b04d5c9a577c69e2d976e2"} Apr 17 17:24:53.707547 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.707522 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jscv2" event={"ID":"ced94cc1-575e-4efc-8406-8add5b3da29c","Type":"ContainerStarted","Data":"26b1317cc8b9c1c27dcc4b074ae151303455df3437a7db5861a111cb181f7f7b"} Apr 17 17:24:53.708708 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.708685 2579 generic.go:358] "Generic (PLEG): container finished" podID="28463658-293e-4847-bb58-c40452c9ceba" containerID="07e2ad9b137e724472441d515a0cda5fe4a26beb894257fa7d8eba1c4b978c12" exitCode=0 Apr 17 17:24:53.708785 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.708755 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerDied","Data":"07e2ad9b137e724472441d515a0cda5fe4a26beb894257fa7d8eba1c4b978c12"} Apr 17 17:24:53.710128 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.710093 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bzxhn" event={"ID":"a926d37e-a35b-4d6d-a341-5f224db6cd94","Type":"ContainerStarted","Data":"887fe243c5b9a23b75084b917341fd5cc53f808e33460b9845b4aa6a0c26514a"} Apr 17 17:24:53.711228 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.711208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" 
event={"ID":"54d743cb-8308-489d-817b-2a68d49ddfa1","Type":"ContainerStarted","Data":"6fd168ad57bc7a046b29f1261629d2dbd2faeb2880b12bf5285621c81708aab9"} Apr 17 17:24:53.712531 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.712501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-njgk7" event={"ID":"5f87164f-e1cb-4cad-aeef-d75c4e3648c7","Type":"ContainerStarted","Data":"0e452af24b5757587c124c6dd4722700b1476641a21e421cbfb7af87577fdaf8"} Apr 17 17:24:53.714249 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.714226 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:24:53.714532 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.714512 2579 generic.go:358] "Generic (PLEG): container finished" podID="5ec8213b-d815-438e-ab3d-f610b8fc1f8a" containerID="c1af1047462a98e7a5ebe9552463688a2f3923c2d5bc9fb59637415b99da9b7f" exitCode=1 Apr 17 17:24:53.714633 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.714538 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"8aae6db657467f642a1961cebabf2b97c65f963f2e7e44160ab69d0389cafc13"} Apr 17 17:24:53.714633 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.714560 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"ca9b43dc413c12db985c6e7092b886606db8f92b2c14851021514e325c7d8656"} Apr 17 17:24:53.714633 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.714570 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" 
event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerDied","Data":"c1af1047462a98e7a5ebe9552463688a2f3923c2d5bc9fb59637415b99da9b7f"} Apr 17 17:24:53.714633 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.714580 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"e9e6d29a957024b2bedeaa7d5b62b59e6c46931ee6695e96628366d4a64b8eb2"} Apr 17 17:24:53.743044 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.742987 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-njgk7" podStartSLOduration=3.891118498 podStartE2EDuration="20.742973423s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.315801081 +0000 UTC m=+3.222533868" lastFinishedPulling="2026-04-17 17:24:53.167656013 +0000 UTC m=+20.074388793" observedRunningTime="2026-04-17 17:24:53.74287771 +0000 UTC m=+20.649610514" watchObservedRunningTime="2026-04-17 17:24:53.742973423 +0000 UTC m=+20.649706225" Apr 17 17:24:53.755237 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.755194 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bzxhn" podStartSLOduration=3.918382983 podStartE2EDuration="20.755181177s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.310939929 +0000 UTC m=+3.217672721" lastFinishedPulling="2026-04-17 17:24:53.147738125 +0000 UTC m=+20.054470915" observedRunningTime="2026-04-17 17:24:53.754913887 +0000 UTC m=+20.661646692" watchObservedRunningTime="2026-04-17 17:24:53.755181177 +0000 UTC m=+20.661914023" Apr 17 17:24:53.782490 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.782431 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9szwb" podStartSLOduration=7.686115923 
podStartE2EDuration="19.782413901s" podCreationTimestamp="2026-04-17 17:24:34 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.316577193 +0000 UTC m=+3.223309980" lastFinishedPulling="2026-04-17 17:24:48.412875174 +0000 UTC m=+15.319607958" observedRunningTime="2026-04-17 17:24:53.770432737 +0000 UTC m=+20.677165541" watchObservedRunningTime="2026-04-17 17:24:53.782413901 +0000 UTC m=+20.689146704" Apr 17 17:24:53.783104 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:53.783068 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jscv2" podStartSLOduration=3.9467932489999997 podStartE2EDuration="20.78305936s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.31317304 +0000 UTC m=+3.219905821" lastFinishedPulling="2026-04-17 17:24:53.149439151 +0000 UTC m=+20.056171932" observedRunningTime="2026-04-17 17:24:53.7823397 +0000 UTC m=+20.689072503" watchObservedRunningTime="2026-04-17 17:24:53.78305936 +0000 UTC m=+20.689792164" Apr 17 17:24:54.535237 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.535199 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:24:54.636389 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.636356 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:54.636572 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:54.636474 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95" Apr 17 17:24:54.660416 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.660301 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:54.535219789Z","UUID":"4768ab50-a345-4df4-8151-279fa781d7c6","Handler":null,"Name":"","Endpoint":""} Apr 17 17:24:54.662636 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.662612 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:24:54.662636 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.662640 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:24:54.718243 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.718198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bbk45" event={"ID":"8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b","Type":"ContainerStarted","Data":"fc84b9e96288cb9de776603a10f243c9b35436da7dc68f3bb3bcd14492f686b8"} Apr 17 17:24:54.720113 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.720073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" event={"ID":"54d743cb-8308-489d-817b-2a68d49ddfa1","Type":"ContainerStarted","Data":"9f50093297dc46b0a761654a922638397c65bd334ccac3128c94fb6ab92a716f"} Apr 17 17:24:54.722784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.722763 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:24:54.723219 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.723183 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"1ee0025ffc99a708a9f9f148f8861f3662b765d557207c679af85fbcdf2f5001"} Apr 17 17:24:54.723332 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.723225 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"b91ce280aae0929cefbe5a8d7c2e43f4c4699c139f1fe1a035626a352f6b9e51"} Apr 17 17:24:54.730994 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:54.730941 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bbk45" podStartSLOduration=5.299603009 podStartE2EDuration="21.73092677s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.339752861 +0000 UTC m=+3.246485643" lastFinishedPulling="2026-04-17 17:24:52.771076623 +0000 UTC m=+19.677809404" observedRunningTime="2026-04-17 17:24:54.730479449 +0000 UTC m=+21.637212252" watchObservedRunningTime="2026-04-17 17:24:54.73092677 +0000 UTC m=+21.637659574" Apr 17 17:24:55.383110 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:55.383031 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bzxhn" Apr 17 17:24:55.383771 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:55.383748 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bzxhn" Apr 17 17:24:55.636527 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:55.636449 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:55.636704 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:55.636449 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:55.636704 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:55.636586 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:55.636704 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:55.636680 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:55.726257 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:55.726004 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bzxhn" Apr 17 17:24:55.726660 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:55.726325 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bzxhn" Apr 17 17:24:56.117372 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.117341 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-szhhg"] Apr 17 17:24:56.141927 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.141903 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.144636 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.144609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:24:56.144755 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.144736 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:24:56.144827 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.144743 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zpnn2\"" Apr 17 17:24:56.244915 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.244889 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb9e24d4-7146-488f-a450-9bd6feba5465-tmp-dir\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.245068 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.244941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7rrq\" (UniqueName: \"kubernetes.io/projected/eb9e24d4-7146-488f-a450-9bd6feba5465-kube-api-access-c7rrq\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.245068 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.244967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb9e24d4-7146-488f-a450-9bd6feba5465-hosts-file\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.346343 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.346262 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb9e24d4-7146-488f-a450-9bd6feba5465-tmp-dir\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.346343 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.346323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7rrq\" (UniqueName: \"kubernetes.io/projected/eb9e24d4-7146-488f-a450-9bd6feba5465-kube-api-access-c7rrq\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.346553 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.346356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb9e24d4-7146-488f-a450-9bd6feba5465-hosts-file\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.346553 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.346448 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eb9e24d4-7146-488f-a450-9bd6feba5465-hosts-file\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.346685 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.346660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb9e24d4-7146-488f-a450-9bd6feba5465-tmp-dir\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.357338 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.357309 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c7rrq\" (UniqueName: \"kubernetes.io/projected/eb9e24d4-7146-488f-a450-9bd6feba5465-kube-api-access-c7rrq\") pod \"node-resolver-szhhg\" (UID: \"eb9e24d4-7146-488f-a450-9bd6feba5465\") " pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.465790 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.465754 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-szhhg" Apr 17 17:24:56.636794 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.636760 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:56.636985 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:56.636870 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95" Apr 17 17:24:56.730072 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.730038 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" event={"ID":"54d743cb-8308-489d-817b-2a68d49ddfa1","Type":"ContainerStarted","Data":"21cccee2aad1636155b9f646c6d502893c8445bf58f914dc3b05b7525d2f9cda"} Apr 17 17:24:56.733013 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.732990 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:24:56.733387 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.733352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"35578c9286a30c76b3180196af33bb620b5b65c382768328f6af135aa5d8bf1d"} Apr 17 17:24:56.747857 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.747822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:56.748010 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:56.747977 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:56.748074 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:56.748062 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret podName:f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:04.74804418 +0000 UTC m=+31.654776985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret") pod "global-pull-secret-syncer-5j2pv" (UID: "f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:56.751383 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:56.751340 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swxn6" podStartSLOduration=4.404742965 podStartE2EDuration="23.751324823s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.339816897 +0000 UTC m=+3.246549682" lastFinishedPulling="2026-04-17 17:24:55.686398759 +0000 UTC m=+22.593131540" observedRunningTime="2026-04-17 17:24:56.750746704 +0000 UTC m=+23.657479508" watchObservedRunningTime="2026-04-17 17:24:56.751324823 +0000 UTC m=+23.658057652" Apr 17 17:24:57.636861 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:57.636825 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:57.637064 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:57.636947 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef" Apr 17 17:24:57.637064 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:57.637007 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:24:57.637179 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:57.637114 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb" Apr 17 17:24:58.064164 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:24:58.064125 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb9e24d4_7146_488f_a450_9bd6feba5465.slice/crio-de6648bd604dee96797e3494baa4df86c1d5a218950b73e603b2d88cb0d25b97 WatchSource:0}: Error finding container de6648bd604dee96797e3494baa4df86c1d5a218950b73e603b2d88cb0d25b97: Status 404 returned error can't find the container with id de6648bd604dee96797e3494baa4df86c1d5a218950b73e603b2d88cb0d25b97 Apr 17 17:24:58.637063 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.636905 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:24:58.637190 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:58.637170 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95" Apr 17 17:24:58.739891 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.739862 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:24:58.740219 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.740193 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"cf2823bae82e39989cd426d2b2060aef28d83ed3cff15f635a5edda8b0356791"} Apr 17 17:24:58.740467 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.740448 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:58.740548 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.740476 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:58.740696 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.740676 2579 scope.go:117] "RemoveContainer" containerID="c1af1047462a98e7a5ebe9552463688a2f3923c2d5bc9fb59637415b99da9b7f" Apr 17 17:24:58.741627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.741604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-szhhg" event={"ID":"eb9e24d4-7146-488f-a450-9bd6feba5465","Type":"ContainerStarted","Data":"72318aa1820117df4ce97707f699e5112973b7a0667e5030d686a1ac6bb42453"} Apr 17 17:24:58.741724 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.741635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-szhhg" event={"ID":"eb9e24d4-7146-488f-a450-9bd6feba5465","Type":"ContainerStarted","Data":"de6648bd604dee96797e3494baa4df86c1d5a218950b73e603b2d88cb0d25b97"} Apr 17 17:24:58.743236 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:24:58.743207 2579 generic.go:358] "Generic (PLEG): container finished" podID="28463658-293e-4847-bb58-c40452c9ceba" containerID="423f5235bacdaee1b3a82683894e8eb1f1eb9e2bce3e4846122e0ee5b6ebfac0" exitCode=0 Apr 17 17:24:58.743236 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.743236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerDied","Data":"423f5235bacdaee1b3a82683894e8eb1f1eb9e2bce3e4846122e0ee5b6ebfac0"} Apr 17 17:24:58.756872 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.756850 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:24:58.837921 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:58.837873 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-szhhg" podStartSLOduration=2.837858624 podStartE2EDuration="2.837858624s" podCreationTimestamp="2026-04-17 17:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:58.837779679 +0000 UTC m=+25.744512481" watchObservedRunningTime="2026-04-17 17:24:58.837858624 +0000 UTC m=+25.744591424" Apr 17 17:24:59.636163 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.636123 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:24:59.636163 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.636153 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:24:59.636652 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:59.636228 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef"
Apr 17 17:24:59.636652 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:24:59.636336 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb"
Apr 17 17:24:59.762259 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.762231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log"
Apr 17 17:24:59.762630 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.762586 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" event={"ID":"5ec8213b-d815-438e-ab3d-f610b8fc1f8a","Type":"ContainerStarted","Data":"3b361a8ba9d738705e90cd8cd0748b1a6e72f01737b466d9baee3f13578c74c4"}
Apr 17 17:24:59.762893 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.762875 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:59.779051 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.779026 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4"
Apr 17 17:24:59.794529 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:24:59.794470 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" podStartSLOduration=8.900699037999999 podStartE2EDuration="25.794452803s" podCreationTimestamp="2026-04-17 17:24:34 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.312321935 +0000 UTC m=+3.219054730" lastFinishedPulling="2026-04-17 17:24:53.206075696 +0000 UTC m=+20.112808495" observedRunningTime="2026-04-17 17:24:59.793426381 +0000 UTC m=+26.700159187" watchObservedRunningTime="2026-04-17 17:24:59.794452803 +0000 UTC m=+26.701185610"
Apr 17 17:25:00.636837 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.636664 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv"
Apr 17 17:25:00.637184 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:00.636906 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95"
Apr 17 17:25:00.705986 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.705953 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5j2pv"]
Apr 17 17:25:00.711448 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.711422 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z942n"]
Apr 17 17:25:00.711559 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.711537 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:25:00.711703 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:00.711678 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb"
Apr 17 17:25:00.714603 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.714568 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m8x5g"]
Apr 17 17:25:00.714710 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.714657 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:00.714788 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:00.714731 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef"
Apr 17 17:25:00.766134 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.766097 2579 generic.go:358] "Generic (PLEG): container finished" podID="28463658-293e-4847-bb58-c40452c9ceba" containerID="3d4a6654e8bd6ef64a8300fa5c37d675098b1bdf5636001b1a178b6ab8e7fdcb" exitCode=0
Apr 17 17:25:00.766296 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.766183 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerDied","Data":"3d4a6654e8bd6ef64a8300fa5c37d675098b1bdf5636001b1a178b6ab8e7fdcb"}
Apr 17 17:25:00.766779 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:00.766758 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv"
Apr 17 17:25:00.766891 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:00.766877 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95"
Apr 17 17:25:02.636001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:02.635960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:25:02.636001 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:02.635974 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:02.636484 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:02.635960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv"
Apr 17 17:25:02.636484 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:02.636100 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb"
Apr 17 17:25:02.636484 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:02.636203 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95"
Apr 17 17:25:02.636484 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:02.636276 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef"
Apr 17 17:25:02.772741 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:02.772702 2579 generic.go:358] "Generic (PLEG): container finished" podID="28463658-293e-4847-bb58-c40452c9ceba" containerID="826c173a4649fea2dacde2f6994a1eac7f261eba2c23452b3a8f748ef27ebeef" exitCode=0
Apr 17 17:25:02.772888 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:02.772758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerDied","Data":"826c173a4649fea2dacde2f6994a1eac7f261eba2c23452b3a8f748ef27ebeef"}
Apr 17 17:25:04.636667 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:04.636628 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv"
Apr 17 17:25:04.637349 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:04.636749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:04.637349 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:04.636759 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5j2pv" podUID="f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95"
Apr 17 17:25:04.637349 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:04.636779 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:25:04.637349 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:04.636862 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m8x5g" podUID="0ace93ad-4902-4616-82aa-f2d931df41ef"
Apr 17 17:25:04.637349 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:04.636944 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z942n" podUID="d168f0a0-7fcd-4905-a424-24a94b7fcdbb"
Apr 17 17:25:04.806634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:04.806585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv"
Apr 17 17:25:04.806805 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:04.806734 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:04.806805 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:04.806802 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret podName:f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:20.806787834 +0000 UTC m=+47.713520620 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret") pod "global-pull-secret-syncer-5j2pv" (UID: "f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:05.440216 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.440184 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-17.ec2.internal" event="NodeReady"
Apr 17 17:25:05.440401 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.440353 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 17:25:05.485073 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.484710 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"]
Apr 17 17:25:05.516554 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.516512 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5f49dd587d-rxz8k"]
Apr 17 17:25:05.517438 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.516794 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"
Apr 17 17:25:05.520649 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.520347 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 17:25:05.520649 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.520379 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hrh8k\""
Apr 17 17:25:05.520996 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.520974 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.521185 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.521170 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 17:25:05.521393 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.521374 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.531473 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.531445 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"]
Apr 17 17:25:05.532372 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.532084 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:05.535235 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535208 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 17:25:05.535346 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535327 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.535450 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535430 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-86rgw\""
Apr 17 17:25:05.535621 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535583 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 17:25:05.535702 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535620 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 17:25:05.535702 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535698 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.535952 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.535934 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 17:25:05.551884 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.551856 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r2mbx"]
Apr 17 17:25:05.552061 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.552043 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"
Apr 17 17:25:05.557652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.557617 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 17:25:05.557652 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.557644 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.558449 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.558429 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 17:25:05.558558 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.558460 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.558662 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.558643 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2scp9\""
Apr 17 17:25:05.573063 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.573038 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5fc859f89d-s6vt8"]
Apr 17 17:25:05.573184 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.573104 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r2mbx"
Apr 17 17:25:05.576163 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.576142 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.579104 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.579081 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-6pbk6\""
Apr 17 17:25:05.580456 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.580187 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.580456 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.580291 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 17:25:05.580712 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.580690 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 17:25:05.586231 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.586208 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 17:25:05.588229 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.588203 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r2mbx"]
Apr 17 17:25:05.588307 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.588249 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"]
Apr 17 17:25:05.588307 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.588267 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-46l9f"]
Apr 17 17:25:05.588307 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.588278 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:05.590531 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.590507 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvzpk\""
Apr 17 17:25:05.590647 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.590512 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 17:25:05.591223 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.591197 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 17:25:05.591324 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.591224 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 17:25:05.597813 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.597796 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 17:25:05.600159 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.600139 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"]
Apr 17 17:25:05.600252 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.600164 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"]
Apr 17 17:25:05.600322 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.600294 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:05.603782 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.603517 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9l5w8\""
Apr 17 17:25:05.603782 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.603640 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 17:25:05.603948 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.603922 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 17:25:05.612308 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612285 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq"]
Apr 17 17:25:05.612410 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612326 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kxqw\" (UniqueName: \"kubernetes.io/projected/ed6fdd82-ec63-4507-83a7-188a60111e24-kube-api-access-2kxqw\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"
Apr 17 17:25:05.612410 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:05.612410 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612397 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6fdd82-ec63-4507-83a7-188a60111e24-config\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"
Apr 17 17:25:05.612609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612414 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"
Apr 17 17:25:05.612609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-stats-auth\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:05.612609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612454 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-default-certificate\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:05.612609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612479 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfnmw\" (UniqueName: \"kubernetes.io/projected/757e5944-43d8-40d9-bf59-81391d9f77cf-kube-api-access-jfnmw\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:05.612609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612531 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:05.612609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.612556 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6fdd82-ec63-4507-83a7-188a60111e24-serving-cert\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"
Apr 17 17:25:05.615244 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.615222 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 17:25:05.615509 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.615493 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.615824 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.615804 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.616445 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.616354 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5qqw4\""
Apr 17 17:25:05.624414 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.624394 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"]
Apr 17 17:25:05.624551 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.624534 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq"
Apr 17 17:25:05.627402 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.627384 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.627515 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.627454 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-27x84\""
Apr 17 17:25:05.627720 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.627703 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.636472 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.636449 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm"]
Apr 17 17:25:05.636634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.636616 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:05.638773 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.638753 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 17:25:05.638773 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.638771 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rw8lm\""
Apr 17 17:25:05.639162 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.638759 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 17:25:05.648517 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.648492 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-z445c"]
Apr 17 17:25:05.648667 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.648651 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm"
Apr 17 17:25:05.651124 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.651079 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.651229 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.651183 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 17:25:05.651285 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.651082 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 17:25:05.651372 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.651079 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.651372 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.651370 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-jd7lf\""
Apr 17 17:25:05.657820 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.657799 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-z445c"
Apr 17 17:25:05.660839 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.660154 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.660839 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.660217 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 17:25:05.660839 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.660230 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-mnc4k\""
Apr 17 17:25:05.660839 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.660281 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 17:25:05.660839 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.660527 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.661816 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.661799 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8mf4x"]
Apr 17 17:25:05.664737 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.664716 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 17:25:05.673462 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.673433 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz"]
Apr 17 17:25:05.673634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.673570 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8mf4x"
Apr 17 17:25:05.675858 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.675837 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.675963 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.675923 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9t7x\""
Apr 17 17:25:05.676232 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.676184 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 17:25:05.676232 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.676206 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.685537 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685518 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f49dd587d-rxz8k"]
Apr 17 17:25:05.685651 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685543 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5fc859f89d-s6vt8"]
Apr 17 17:25:05.685651 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685553 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz"
Apr 17 17:25:05.685651 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685556 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-46l9f"]
Apr 17 17:25:05.685651 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685644 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq"]
Apr 17 17:25:05.685845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685660 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"]
Apr 17 17:25:05.685845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685674 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8mf4x"]
Apr 17 17:25:05.685845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685686 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"]
Apr 17 17:25:05.685845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685706 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz"]
Apr 17 17:25:05.685845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685728 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm"]
Apr 17 17:25:05.685845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.685740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-z445c"]
Apr 17 17:25:05.687740 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.687722 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:25:05.687837 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.687736 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:25:05.687837 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.687757 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-sv2fg\""
Apr 17 17:25:05.712967 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.712930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6656a7d-18be-4793-8aae-ca80248fd4ac-config-volume\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:05.713143 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.712983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kxqw\" (UniqueName: \"kubernetes.io/projected/ed6fdd82-ec63-4507-83a7-188a60111e24-kube-api-access-2kxqw\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"
Apr 17 17:25:05.713143 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713025 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94b536ad-a08a-4eea-b44e-9a2802212a72-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx"
Apr 17 17:25:05.713143 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713056 2579 reconciler_common.go:224] "operationExecutor.MountVolume started
for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-stats-auth\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.713143 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.713143 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-default-certificate\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713166 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2e1343de-f52a-4262-8e36-2270dd39d6a2-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-installation-pull-secrets\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: 
\"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/94b536ad-a08a-4eea-b44e-9a2802212a72-snapshots\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-image-registry-private-configuration\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713290 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713307 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-bound-sa-token\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: 
I0417 17:25:05.713346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:05.713398 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-trusted-ca\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713443 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nddj\" (UniqueName: \"kubernetes.io/projected/94b536ad-a08a-4eea-b44e-9a2802212a72-kube-api-access-5nddj\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713473 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94b536ad-a08a-4eea-b44e-9a2802212a72-serving-cert\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713514 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9jt\" (UniqueName: \"kubernetes.io/projected/2e1343de-f52a-4262-8e36-2270dd39d6a2-kube-api-access-hd9jt\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713542 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6htf\" (UniqueName: \"kubernetes.io/projected/ad57895c-7274-4a8b-a653-20f918afed96-kube-api-access-n6htf\") pod \"volume-data-source-validator-7c6cbb6c87-rnfvq\" (UID: \"ad57895c-7274-4a8b-a653-20f918afed96\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713575 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94b536ad-a08a-4eea-b44e-9a2802212a72-service-ca-bundle\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713615 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6fdd82-ec63-4507-83a7-188a60111e24-config\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: 
\"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.713646 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.213628373 +0000 UTC m=+33.120361174 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : configmap references non-existent config key: service-ca.crt Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-certificates\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jfnmw\" (UniqueName: \"kubernetes.io/projected/757e5944-43d8-40d9-bf59-81391d9f77cf-kube-api-access-jfnmw\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.713778 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713725 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/94b536ad-a08a-4eea-b44e-9a2802212a72-tmp\") pod 
\"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713829 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nnj6\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-kube-api-access-7nnj6\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713858 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713888 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482108e9-1395-4eda-884c-859f77d7a6be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.713971 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: 
\"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cqwp\" (UniqueName: \"kubernetes.io/projected/482108e9-1395-4eda-884c-859f77d7a6be-kube-api-access-8cqwp\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6656a7d-18be-4793-8aae-ca80248fd4ac-tmp-dir\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed6fdd82-ec63-4507-83a7-188a60111e24-config\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714161 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86fsg\" (UniqueName: \"kubernetes.io/projected/f6656a7d-18be-4793-8aae-ca80248fd4ac-kube-api-access-86fsg\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714194 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6fdd82-ec63-4507-83a7-188a60111e24-serving-cert\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714217 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482108e9-1395-4eda-884c-859f77d7a6be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.714258 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714272 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffz8n\" (UniqueName: \"kubernetes.io/projected/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-kube-api-access-ffz8n\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:05.714287 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.714296 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0da0d0b-459b-4fcc-a426-b97d20867b60-ca-trust-extracted\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.715002 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.714320 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.214304475 +0000 UTC m=+33.121037272 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : secret "router-metrics-certs-default" not found Apr 17 17:25:05.717882 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.717693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed6fdd82-ec63-4507-83a7-188a60111e24-serving-cert\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" Apr 17 17:25:05.718004 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.717842 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-default-certificate\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.718004 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.717959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-stats-auth\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.722193 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.722166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kxqw\" (UniqueName: \"kubernetes.io/projected/ed6fdd82-ec63-4507-83a7-188a60111e24-kube-api-access-2kxqw\") pod \"service-ca-operator-d6fc45fc5-87bk5\" (UID: \"ed6fdd82-ec63-4507-83a7-188a60111e24\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" Apr 17 17:25:05.722193 
ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.722187 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfnmw\" (UniqueName: \"kubernetes.io/projected/757e5944-43d8-40d9-bf59-81391d9f77cf-kube-api-access-jfnmw\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:05.815114 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-trusted-ca\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.815114 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nddj\" (UniqueName: \"kubernetes.io/projected/94b536ad-a08a-4eea-b44e-9a2802212a72-kube-api-access-5nddj\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.815355 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94b536ad-a08a-4eea-b44e-9a2802212a72-serving-cert\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.815355 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9jt\" (UniqueName: \"kubernetes.io/projected/2e1343de-f52a-4262-8e36-2270dd39d6a2-kube-api-access-hd9jt\") pod 
\"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.815355 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6htf\" (UniqueName: \"kubernetes.io/projected/ad57895c-7274-4a8b-a653-20f918afed96-kube-api-access-n6htf\") pod \"volume-data-source-validator-7c6cbb6c87-rnfvq\" (UID: \"ad57895c-7274-4a8b-a653-20f918afed96\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" Apr 17 17:25:05.815355 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94b536ad-a08a-4eea-b44e-9a2802212a72-service-ca-bundle\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.815536 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815458 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-certificates\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.815536 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815510 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/94b536ad-a08a-4eea-b44e-9a2802212a72-tmp\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.815638 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:25:05.815539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nnj6\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-kube-api-access-7nnj6\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.815638 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:05.815638 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815612 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-config\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.815784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482108e9-1395-4eda-884c-859f77d7a6be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.815784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.815784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cqwp\" (UniqueName: \"kubernetes.io/projected/482108e9-1395-4eda-884c-859f77d7a6be-kube-api-access-8cqwp\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.815784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815733 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6656a7d-18be-4793-8aae-ca80248fd4ac-tmp-dir\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.815784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8cvx\" (UniqueName: \"kubernetes.io/projected/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-kube-api-access-x8cvx\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:05.815784 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-serving-cert\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.816085 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86fsg\" (UniqueName: \"kubernetes.io/projected/f6656a7d-18be-4793-8aae-ca80248fd4ac-kube-api-access-86fsg\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.816085 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815859 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-trusted-ca\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.816085 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/94b536ad-a08a-4eea-b44e-9a2802212a72-tmp\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.816085 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94b536ad-a08a-4eea-b44e-9a2802212a72-service-ca-bundle\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.816085 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.815993 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-certificates\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816085 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816052 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816107 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816109 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816126 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls podName:9617e4e1-7e6f-467b-92b9-0a933eb32dc6 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.316105871 +0000 UTC m=+33.222838658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5647w" (UID: "9617e4e1-7e6f-467b-92b9-0a933eb32dc6") : secret "samples-operator-tls" not found Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816168 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls podName:2e1343de-f52a-4262-8e36-2270dd39d6a2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.316151852 +0000 UTC m=+33.222884648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzqfr" (UID: "2e1343de-f52a-4262-8e36-2270dd39d6a2") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6656a7d-18be-4793-8aae-ca80248fd4ac-tmp-dir\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816187 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482108e9-1395-4eda-884c-859f77d7a6be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816239 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls podName:f6656a7d-18be-4793-8aae-ca80248fd4ac nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.31622566 +0000 UTC m=+33.222958452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls") pod "dns-default-46l9f" (UID: "f6656a7d-18be-4793-8aae-ca80248fd4ac") : secret "dns-default-metrics-tls" not found Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816273 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffz8n\" (UniqueName: \"kubernetes.io/projected/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-kube-api-access-ffz8n\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkkm\" (UniqueName: \"kubernetes.io/projected/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-kube-api-access-tnkkm\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.816357 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0da0d0b-459b-4fcc-a426-b97d20867b60-ca-trust-extracted\") pod 
\"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6656a7d-18be-4793-8aae-ca80248fd4ac-config-volume\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-trusted-ca\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94b536ad-a08a-4eea-b44e-9a2802212a72-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2e1343de-f52a-4262-8e36-2270dd39d6a2-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-installation-pull-secrets\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/94b536ad-a08a-4eea-b44e-9a2802212a72-snapshots\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wds9d\" (UniqueName: \"kubernetes.io/projected/0c9357c2-cf5b-4e52-889b-e7a839ac8e1d-kube-api-access-wds9d\") pod \"network-check-source-8894fc9bd-n6tvz\" (UID: \"0c9357c2-cf5b-4e52-889b-e7a839ac8e1d\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-image-registry-private-configuration\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0da0d0b-459b-4fcc-a426-b97d20867b60-ca-trust-extracted\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-bound-sa-token\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816775 2579 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816790 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fc859f89d-s6vt8: secret "image-registry-tls" not found Apr 17 17:25:05.816924 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816831 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls podName:c0da0d0b-459b-4fcc-a426-b97d20867b60 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.316816743 +0000 UTC m=+33.223549543 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls") pod "image-registry-5fc859f89d-s6vt8" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60") : secret "image-registry-tls" not found Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816919 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.816981 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert podName:b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.31696607 +0000 UTC m=+33.223698855 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fdkcf" (UID: "b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5") : secret "networking-console-plugin-cert" not found Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.816997 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f6656a7d-18be-4793-8aae-ca80248fd4ac-config-volume\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.817309 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94b536ad-a08a-4eea-b44e-9a2802212a72-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " 
pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.817452 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2e1343de-f52a-4262-8e36-2270dd39d6a2-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.817698 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.817467 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:05.818036 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.817746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/94b536ad-a08a-4eea-b44e-9a2802212a72-snapshots\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.818566 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.818545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482108e9-1395-4eda-884c-859f77d7a6be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.819560 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.819530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-image-registry-private-configuration\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.819668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.819613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-installation-pull-secrets\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.819913 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.819893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94b536ad-a08a-4eea-b44e-9a2802212a72-serving-cert\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.824466 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.824384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6htf\" (UniqueName: \"kubernetes.io/projected/ad57895c-7274-4a8b-a653-20f918afed96-kube-api-access-n6htf\") pod \"volume-data-source-validator-7c6cbb6c87-rnfvq\" (UID: \"ad57895c-7274-4a8b-a653-20f918afed96\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" Apr 17 17:25:05.824569 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.824487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nnj6\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-kube-api-access-7nnj6\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") 
" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.825336 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.825311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffz8n\" (UniqueName: \"kubernetes.io/projected/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-kube-api-access-ffz8n\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:05.825866 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.825838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482108e9-1395-4eda-884c-859f77d7a6be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.826296 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.826274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86fsg\" (UniqueName: \"kubernetes.io/projected/f6656a7d-18be-4793-8aae-ca80248fd4ac-kube-api-access-86fsg\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:05.826968 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.826920 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nddj\" (UniqueName: \"kubernetes.io/projected/94b536ad-a08a-4eea-b44e-9a2802212a72-kube-api-access-5nddj\") pod \"insights-operator-585dfdc468-r2mbx\" (UID: \"94b536ad-a08a-4eea-b44e-9a2802212a72\") " pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.827306 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.827261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hd9jt\" (UniqueName: \"kubernetes.io/projected/2e1343de-f52a-4262-8e36-2270dd39d6a2-kube-api-access-hd9jt\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:05.832527 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.832506 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" Apr 17 17:25:05.835846 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.835803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-bound-sa-token\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:05.836624 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.836545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cqwp\" (UniqueName: \"kubernetes.io/projected/482108e9-1395-4eda-884c-859f77d7a6be-kube-api-access-8cqwp\") pod \"kube-storage-version-migrator-operator-6769c5d45-nnctm\" (UID: \"482108e9-1395-4eda-884c-859f77d7a6be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.883183 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.883148 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r2mbx" Apr 17 17:25:05.917875 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.917779 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-trusted-ca\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkkm\" (UniqueName: \"kubernetes.io/projected/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-kube-api-access-tnkkm\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wds9d\" (UniqueName: \"kubernetes.io/projected/0c9357c2-cf5b-4e52-889b-e7a839ac8e1d-kube-api-access-wds9d\") pod \"network-check-source-8894fc9bd-n6tvz\" (UID: \"0c9357c2-cf5b-4e52-889b-e7a839ac8e1d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.918383 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not 
found Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-config\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:05.918446 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert podName:6ba0a3f3-4f4d-4ea1-a514-e449db3682e3 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.418426866 +0000 UTC m=+33.325159664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert") pod "ingress-canary-8mf4x" (UID: "6ba0a3f3-4f4d-4ea1-a514-e449db3682e3") : secret "canary-serving-cert" not found Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8cvx\" (UniqueName: \"kubernetes.io/projected/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-kube-api-access-x8cvx\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:05.918668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-serving-cert\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.919181 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.918859 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-trusted-ca\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.919717 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.919667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-config\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.922976 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.922948 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-serving-cert\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.928260 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.928202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkkm\" (UniqueName: \"kubernetes.io/projected/d7d8f932-f808-4d50-8b55-ad125b8b9a2c-kube-api-access-tnkkm\") pod \"console-operator-9d4b6777b-z445c\" (UID: \"d7d8f932-f808-4d50-8b55-ad125b8b9a2c\") " pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.928902 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.928808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8cvx\" (UniqueName: \"kubernetes.io/projected/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-kube-api-access-x8cvx\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " 
pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:05.929767 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.929744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wds9d\" (UniqueName: \"kubernetes.io/projected/0c9357c2-cf5b-4e52-889b-e7a839ac8e1d-kube-api-access-wds9d\") pod \"network-check-source-8894fc9bd-n6tvz\" (UID: \"0c9357c2-cf5b-4e52-889b-e7a839ac8e1d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" Apr 17 17:25:05.934769 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.934384 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" Apr 17 17:25:05.958880 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.958849 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" Apr 17 17:25:05.968769 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.968301 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:05.995027 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:05.994991 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" Apr 17 17:25:06.042851 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.042825 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5"] Apr 17 17:25:06.061797 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.061703 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r2mbx"] Apr 17 17:25:06.079315 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:06.079262 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded6fdd82_ec63_4507_83a7_188a60111e24.slice/crio-cef443667352d7992b5962b630a2f7ec415e627805ef0c34bb5ecefe5db5c53a WatchSource:0}: Error finding container cef443667352d7992b5962b630a2f7ec415e627805ef0c34bb5ecefe5db5c53a: Status 404 returned error can't find the container with id cef443667352d7992b5962b630a2f7ec415e627805ef0c34bb5ecefe5db5c53a Apr 17 17:25:06.105809 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:06.105768 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b536ad_a08a_4eea_b44e_9a2802212a72.slice/crio-06499f13d450c4226113b2b873b17611cc69c478ea0d200740ccebe2bf45e3aa WatchSource:0}: Error finding container 06499f13d450c4226113b2b873b17611cc69c478ea0d200740ccebe2bf45e3aa: Status 404 returned error can't find the container with id 06499f13d450c4226113b2b873b17611cc69c478ea0d200740ccebe2bf45e3aa Apr 17 17:25:06.145658 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.145582 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq"] Apr 17 17:25:06.170396 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.170369 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm"]
Apr 17 17:25:06.173503 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:06.173462 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482108e9_1395_4eda_884c_859f77d7a6be.slice/crio-c974ba7857a5a4bd878e141a4a6a90937a9034442c14e15043fb7c77fe911e7c WatchSource:0}: Error finding container c974ba7857a5a4bd878e141a4a6a90937a9034442c14e15043fb7c77fe911e7c: Status 404 returned error can't find the container with id c974ba7857a5a4bd878e141a4a6a90937a9034442c14e15043fb7c77fe911e7c
Apr 17 17:25:06.182982 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.182955 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-z445c"]
Apr 17 17:25:06.186557 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:06.186482 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d8f932_f808_4d50_8b55_ad125b8b9a2c.slice/crio-bc81e20432cbf7beb82375f08feaeb08b2c10c581b0a3dfe7552ed4c2c1c1320 WatchSource:0}: Error finding container bc81e20432cbf7beb82375f08feaeb08b2c10c581b0a3dfe7552ed4c2c1c1320: Status 404 returned error can't find the container with id bc81e20432cbf7beb82375f08feaeb08b2c10c581b0a3dfe7552ed4c2c1c1320
Apr 17 17:25:06.201974 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.201912 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz"]
Apr 17 17:25:06.205548 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:06.205518 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c9357c2_cf5b_4e52_889b_e7a839ac8e1d.slice/crio-3474901c880a975ed2bf8754059bfbca0b98e0374680c48b84b7c2584702d204 WatchSource:0}: Error finding container 3474901c880a975ed2bf8754059bfbca0b98e0374680c48b84b7c2584702d204: Status 404 returned error can't find the container with id 3474901c880a975ed2bf8754059bfbca0b98e0374680c48b84b7c2584702d204
Apr 17 17:25:06.222115 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.222086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:06.222234 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.222165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:06.222327 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.222305 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.222276376 +0000 UTC m=+34.129009168 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:06.222440 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.222339 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:06.222440 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.222404 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.222387984 +0000 UTC m=+34.129120781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : secret "router-metrics-certs-default" not found
Apr 17 17:25:06.323036 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.322990 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"
Apr 17 17:25:06.323036 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.323042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.323082 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323114 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.323142 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323194 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls podName:9617e4e1-7e6f-467b-92b9-0a933eb32dc6 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.323173394 +0000 UTC m=+34.229906199 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5647w" (UID: "9617e4e1-7e6f-467b-92b9-0a933eb32dc6") : secret "samples-operator-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323217 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323230 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323243 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fc859f89d-s6vt8: secret "image-registry-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.323246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323255 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls podName:f6656a7d-18be-4793-8aae-ca80248fd4ac nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.323244449 +0000 UTC m=+34.229977243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls") pod "dns-default-46l9f" (UID: "f6656a7d-18be-4793-8aae-ca80248fd4ac") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323219 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:06.323283 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323286 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls podName:c0da0d0b-459b-4fcc-a426-b97d20867b60 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.323272219 +0000 UTC m=+34.230005014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls") pod "image-registry-5fc859f89d-s6vt8" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60") : secret "image-registry-tls" not found
Apr 17 17:25:06.323764 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323305 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls podName:2e1343de-f52a-4262-8e36-2270dd39d6a2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.32329691 +0000 UTC m=+34.230029692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzqfr" (UID: "2e1343de-f52a-4262-8e36-2270dd39d6a2") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:06.323764 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323306 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 17:25:06.323764 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.323343 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert podName:b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.323333421 +0000 UTC m=+34.230066204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fdkcf" (UID: "b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5") : secret "networking-console-plugin-cert" not found
Apr 17 17:25:06.424147 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.424096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x"
Apr 17 17:25:06.424331 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.424269 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:06.424385 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:06.424357 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert podName:6ba0a3f3-4f4d-4ea1-a514-e449db3682e3 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.424336287 +0000 UTC m=+34.331069071 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert") pod "ingress-canary-8mf4x" (UID: "6ba0a3f3-4f4d-4ea1-a514-e449db3682e3") : secret "canary-serving-cert" not found
Apr 17 17:25:06.636058 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.636013 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv"
Apr 17 17:25:06.636249 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.636016 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:25:06.636732 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.636383 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:06.638511 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.638488 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:25:06.638654 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.638636 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bp9pv\""
Apr 17 17:25:06.638786 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.638766 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 17:25:06.639236 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.638861 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vr6gd\""
Apr 17 17:25:06.785074 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.785011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" event={"ID":"ed6fdd82-ec63-4507-83a7-188a60111e24","Type":"ContainerStarted","Data":"cef443667352d7992b5962b630a2f7ec415e627805ef0c34bb5ecefe5db5c53a"}
Apr 17 17:25:06.787640 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.787544 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" event={"ID":"0c9357c2-cf5b-4e52-889b-e7a839ac8e1d","Type":"ContainerStarted","Data":"3474901c880a975ed2bf8754059bfbca0b98e0374680c48b84b7c2584702d204"}
Apr 17 17:25:06.790095 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.790047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" event={"ID":"482108e9-1395-4eda-884c-859f77d7a6be","Type":"ContainerStarted","Data":"c974ba7857a5a4bd878e141a4a6a90937a9034442c14e15043fb7c77fe911e7c"}
Apr 17 17:25:06.791424 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.791401 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" event={"ID":"ad57895c-7274-4a8b-a653-20f918afed96","Type":"ContainerStarted","Data":"d6e3e8528bdc51cdd925b740f6845fdec93874f44f88e24dc845759ae409fb7b"}
Apr 17 17:25:06.792565 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.792517 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" event={"ID":"d7d8f932-f808-4d50-8b55-ad125b8b9a2c","Type":"ContainerStarted","Data":"bc81e20432cbf7beb82375f08feaeb08b2c10c581b0a3dfe7552ed4c2c1c1320"}
Apr 17 17:25:06.798620 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:06.794457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r2mbx" event={"ID":"94b536ad-a08a-4eea-b44e-9a2802212a72","Type":"ContainerStarted","Data":"06499f13d450c4226113b2b873b17611cc69c478ea0d200740ccebe2bf45e3aa"}
Apr 17 17:25:07.233616 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.232657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:07.233616 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.232764 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:07.233616 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.232931 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:07.233616 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.233010 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.23299105 +0000 UTC m=+36.139723851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : secret "router-metrics-certs-default" not found
Apr 17 17:25:07.233616 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.233534 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.233518689 +0000 UTC m=+36.140251473 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.333690 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.333794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.333833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.333885 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.333911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n"
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.333972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334100 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334115 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fc859f89d-s6vt8: secret "image-registry-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334175 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls podName:c0da0d0b-459b-4fcc-a426-b97d20867b60 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.3341566 +0000 UTC m=+36.240889401 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls") pod "image-registry-5fc859f89d-s6vt8" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60") : secret "image-registry-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334614 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334670 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert podName:b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.334653803 +0000 UTC m=+36.241386605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fdkcf" (UID: "b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5") : secret "networking-console-plugin-cert" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334730 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334764 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls podName:9617e4e1-7e6f-467b-92b9-0a933eb32dc6 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.3347528 +0000 UTC m=+36.241485585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5647w" (UID: "9617e4e1-7e6f-467b-92b9-0a933eb32dc6") : secret "samples-operator-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334814 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334849 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls podName:2e1343de-f52a-4262-8e36-2270dd39d6a2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.334838391 +0000 UTC m=+36.241571193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzqfr" (UID: "2e1343de-f52a-4262-8e36-2270dd39d6a2") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:07.335042 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334897 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:07.336127 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334924 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls podName:f6656a7d-18be-4793-8aae-ca80248fd4ac nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.334914437 +0000 UTC m=+36.241647233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls") pod "dns-default-46l9f" (UID: "f6656a7d-18be-4793-8aae-ca80248fd4ac") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:07.336127 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.334973 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:25:07.336127 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.335001 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs podName:d168f0a0-7fcd-4905-a424-24a94b7fcdbb nodeName:}" failed. No retries permitted until 2026-04-17 17:25:39.334991068 +0000 UTC m=+66.241723854 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs") pod "network-metrics-daemon-z942n" (UID: "d168f0a0-7fcd-4905-a424-24a94b7fcdbb") : secret "metrics-daemon-secret" not found
Apr 17 17:25:07.435610 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.434972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x"
Apr 17 17:25:07.435610 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.435149 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:07.435610 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:07.435213 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert podName:6ba0a3f3-4f4d-4ea1-a514-e449db3682e3 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.4351942 +0000 UTC m=+36.341927002 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert") pod "ingress-canary-8mf4x" (UID: "6ba0a3f3-4f4d-4ea1-a514-e449db3682e3") : secret "canary-serving-cert" not found
Apr 17 17:25:07.537038 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.535976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:07.552569 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.552485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4tws\" (UniqueName: \"kubernetes.io/projected/0ace93ad-4902-4616-82aa-f2d931df41ef-kube-api-access-q4tws\") pod \"network-check-target-m8x5g\" (UID: \"0ace93ad-4902-4616-82aa-f2d931df41ef\") " pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:07.566916 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:07.566297 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m8x5g"
Apr 17 17:25:09.253954 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.253737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:09.254408 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.253908 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:09.254408 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.254084 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:09.254408 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.254105 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.254078301 +0000 UTC m=+40.160811117 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : secret "router-metrics-certs-default" not found
Apr 17 17:25:09.254408 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.254219 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.254200012 +0000 UTC m=+40.160932881 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:09.355121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.355083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:09.355289 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.355167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"
Apr 17 17:25:09.355289 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.355221 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"
Apr 17 17:25:09.355289 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.355270 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:09.355289 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355278 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 17:25:09.355450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355323 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:25:09.355450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355356 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:09.355450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355362 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert podName:b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.355337136 +0000 UTC m=+40.262069935 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fdkcf" (UID: "b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5") : secret "networking-console-plugin-cert" not found
Apr 17 17:25:09.355450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355382 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls podName:9617e4e1-7e6f-467b-92b9-0a933eb32dc6 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.355372569 +0000 UTC m=+40.262105354 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5647w" (UID: "9617e4e1-7e6f-467b-92b9-0a933eb32dc6") : secret "samples-operator-tls" not found
Apr 17 17:25:09.355450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355388 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:09.355450 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355444 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls podName:2e1343de-f52a-4262-8e36-2270dd39d6a2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.355426514 +0000 UTC m=+40.262159313 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzqfr" (UID: "2e1343de-f52a-4262-8e36-2270dd39d6a2") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:09.355713 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.355487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:09.355713 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355519 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls podName:f6656a7d-18be-4793-8aae-ca80248fd4ac nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.35549766 +0000 UTC m=+40.262230449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls") pod "dns-default-46l9f" (UID: "f6656a7d-18be-4793-8aae-ca80248fd4ac") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:09.355713 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355553 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:09.355713 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355564 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fc859f89d-s6vt8: secret "image-registry-tls" not found
Apr 17 17:25:09.355713 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.355620 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls podName:c0da0d0b-459b-4fcc-a426-b97d20867b60 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.355608482 +0000 UTC m=+40.262341263 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls") pod "image-registry-5fc859f89d-s6vt8" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60") : secret "image-registry-tls" not found Apr 17 17:25:09.456240 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:09.456201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:09.456419 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.456350 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:09.456487 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:09.456459 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert podName:6ba0a3f3-4f4d-4ea1-a514-e449db3682e3 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:13.456404087 +0000 UTC m=+40.363136891 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert") pod "ingress-canary-8mf4x" (UID: "6ba0a3f3-4f4d-4ea1-a514-e449db3682e3") : secret "canary-serving-cert" not found Apr 17 17:25:13.290413 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.290373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:13.290914 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.290601 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:25:13.290914 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.290662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:13.290914 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.290685 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.290658554 +0000 UTC m=+48.197391357 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : secret "router-metrics-certs-default" not found Apr 17 17:25:13.290914 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.290762 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.290747471 +0000 UTC m=+48.197480265 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : configmap references non-existent config key: service-ca.crt Apr 17 17:25:13.391198 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.391162 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:13.391378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.391235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:13.391378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.391308 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:13.391378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.391340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391370 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391383 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.391390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391395 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fc859f89d-s6vt8: secret "image-registry-tls" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391447 
2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391459 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert podName:b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.391436523 +0000 UTC m=+48.298169319 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fdkcf" (UID: "b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5") : secret "networking-console-plugin-cert" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391499 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls podName:c0da0d0b-459b-4fcc-a426-b97d20867b60 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.391479016 +0000 UTC m=+48.298211811 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls") pod "image-registry-5fc859f89d-s6vt8" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60") : secret "image-registry-tls" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391517 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls podName:9617e4e1-7e6f-467b-92b9-0a933eb32dc6 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.391508176 +0000 UTC m=+48.298240964 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5647w" (UID: "9617e4e1-7e6f-467b-92b9-0a933eb32dc6") : secret "samples-operator-tls" not found Apr 17 17:25:13.391546 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391539 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:13.391945 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391572 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls podName:2e1343de-f52a-4262-8e36-2270dd39d6a2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.39155898 +0000 UTC m=+48.298291780 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzqfr" (UID: "2e1343de-f52a-4262-8e36-2270dd39d6a2") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:13.391945 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391620 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:13.391945 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.391660 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls podName:f6656a7d-18be-4793-8aae-ca80248fd4ac nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.391648718 +0000 UTC m=+48.298381521 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls") pod "dns-default-46l9f" (UID: "f6656a7d-18be-4793-8aae-ca80248fd4ac") : secret "dns-default-metrics-tls" not found Apr 17 17:25:13.493069 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:13.493034 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:13.493253 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.493199 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:13.493309 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:13.493287 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert podName:6ba0a3f3-4f4d-4ea1-a514-e449db3682e3 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:21.493265972 +0000 UTC m=+48.399998770 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert") pod "ingress-canary-8mf4x" (UID: "6ba0a3f3-4f4d-4ea1-a514-e449db3682e3") : secret "canary-serving-cert" not found Apr 17 17:25:14.799774 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:14.799747 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m8x5g"] Apr 17 17:25:14.820783 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:14.820749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m8x5g" event={"ID":"0ace93ad-4902-4616-82aa-f2d931df41ef","Type":"ContainerStarted","Data":"307df7c5599ebf6fb25c2a0e0fce2b8a08b5c8ab7eaf8afd6b8ff9243dd03797"} Apr 17 17:25:15.826127 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.826083 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" event={"ID":"ed6fdd82-ec63-4507-83a7-188a60111e24","Type":"ContainerStarted","Data":"b4ae02ec51839858971be87c938dae4ec8d6c97f7b12de3d49bb357ab72874a1"} Apr 17 17:25:15.829048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.829018 2579 generic.go:358] "Generic (PLEG): container finished" podID="28463658-293e-4847-bb58-c40452c9ceba" containerID="4999e312220f7e6237115fd9bcd36a49b82d73391ad3a04551448e753f0746d4" exitCode=0 Apr 17 17:25:15.829178 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.829100 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerDied","Data":"4999e312220f7e6237115fd9bcd36a49b82d73391ad3a04551448e753f0746d4"} Apr 17 17:25:15.830900 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.830873 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" 
event={"ID":"0c9357c2-cf5b-4e52-889b-e7a839ac8e1d","Type":"ContainerStarted","Data":"c27211aaba640d10ec8bdc26a01215e5015d08750509972a47981e183d0ff9ac"} Apr 17 17:25:15.832793 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.832765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" event={"ID":"482108e9-1395-4eda-884c-859f77d7a6be","Type":"ContainerStarted","Data":"85b6f1155b75c44117b9d4cd7a561edeabd90da839b696ec82855c9fac1697f9"} Apr 17 17:25:15.834966 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.834938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" event={"ID":"ad57895c-7274-4a8b-a653-20f918afed96","Type":"ContainerStarted","Data":"35c4bb167cff778c3dccabde08b6c8f2db2cb96c32c91fb22b9069b8ad0a29e3"} Apr 17 17:25:15.837768 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.837135 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/0.log" Apr 17 17:25:15.837768 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.837172 2579 generic.go:358] "Generic (PLEG): container finished" podID="d7d8f932-f808-4d50-8b55-ad125b8b9a2c" containerID="48aac6d66f4d5a79e6dabcf694e4cb652f2f31f2abe9e61d10162def9600545a" exitCode=255 Apr 17 17:25:15.837768 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.837387 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" event={"ID":"d7d8f932-f808-4d50-8b55-ad125b8b9a2c","Type":"ContainerDied","Data":"48aac6d66f4d5a79e6dabcf694e4cb652f2f31f2abe9e61d10162def9600545a"} Apr 17 17:25:15.837768 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.837497 2579 scope.go:117] "RemoveContainer" 
containerID="48aac6d66f4d5a79e6dabcf694e4cb652f2f31f2abe9e61d10162def9600545a" Apr 17 17:25:15.838937 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.838902 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m8x5g" event={"ID":"0ace93ad-4902-4616-82aa-f2d931df41ef","Type":"ContainerStarted","Data":"7e16231d56ac6bc462a72a116bb1685e65071386feddbc7ffdb956f7cef41d63"} Apr 17 17:25:15.839328 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.839295 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:25:15.841454 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.841434 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r2mbx" event={"ID":"94b536ad-a08a-4eea-b44e-9a2802212a72","Type":"ContainerStarted","Data":"500d9c48d56012ae15a71b3c73deeb819c641018fbe6cfdb2a4fe5c5b952b4b5"} Apr 17 17:25:15.842669 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.842621 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" podStartSLOduration=31.264419705999998 podStartE2EDuration="39.842606922s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:06.092758055 +0000 UTC m=+32.999490851" lastFinishedPulling="2026-04-17 17:25:14.670945276 +0000 UTC m=+41.577678067" observedRunningTime="2026-04-17 17:25:15.842365786 +0000 UTC m=+42.749098591" watchObservedRunningTime="2026-04-17 17:25:15.842606922 +0000 UTC m=+42.749339789" Apr 17 17:25:15.859273 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.858327 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m8x5g" podStartSLOduration=41.858308639 podStartE2EDuration="41.858308639s" podCreationTimestamp="2026-04-17 17:24:34 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:15.857658274 +0000 UTC m=+42.764391078" watchObservedRunningTime="2026-04-17 17:25:15.858308639 +0000 UTC m=+42.765041444" Apr 17 17:25:15.883165 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.883107 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" podStartSLOduration=31.396147773 podStartE2EDuration="39.883086094s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:06.1756077 +0000 UTC m=+33.082340495" lastFinishedPulling="2026-04-17 17:25:14.662546028 +0000 UTC m=+41.569278816" observedRunningTime="2026-04-17 17:25:15.882286845 +0000 UTC m=+42.789019649" watchObservedRunningTime="2026-04-17 17:25:15.883086094 +0000 UTC m=+42.789818899" Apr 17 17:25:15.965736 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.964040 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rnfvq" podStartSLOduration=31.455059573 podStartE2EDuration="39.964022454s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:06.154030655 +0000 UTC m=+33.060763448" lastFinishedPulling="2026-04-17 17:25:14.662993534 +0000 UTC m=+41.569726329" observedRunningTime="2026-04-17 17:25:15.963744839 +0000 UTC m=+42.870477643" watchObservedRunningTime="2026-04-17 17:25:15.964022454 +0000 UTC m=+42.870755251" Apr 17 17:25:15.966016 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.965733 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-r2mbx" podStartSLOduration=31.411946666 podStartE2EDuration="39.96571864s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" 
firstStartedPulling="2026-04-17 17:25:06.109168262 +0000 UTC m=+33.015901048" lastFinishedPulling="2026-04-17 17:25:14.66294024 +0000 UTC m=+41.569673022" observedRunningTime="2026-04-17 17:25:15.943506873 +0000 UTC m=+42.850239689" watchObservedRunningTime="2026-04-17 17:25:15.96571864 +0000 UTC m=+42.872451445" Apr 17 17:25:15.969845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.969753 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:15.969845 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:15.969800 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:16.639203 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.639148 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-n6tvz" podStartSLOduration=32.179180864 podStartE2EDuration="40.639132468s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:06.207497265 +0000 UTC m=+33.114230047" lastFinishedPulling="2026-04-17 17:25:14.667448857 +0000 UTC m=+41.574181651" observedRunningTime="2026-04-17 17:25:15.990560698 +0000 UTC m=+42.897293503" watchObservedRunningTime="2026-04-17 17:25:16.639132468 +0000 UTC m=+43.545865270" Apr 17 17:25:16.639802 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.639786 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4"] Apr 17 17:25:16.646835 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.646804 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" Apr 17 17:25:16.649872 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.649851 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:16.650118 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.650101 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-sxctk\"" Apr 17 17:25:16.650208 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.650137 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 17:25:16.655314 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.655287 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4"] Apr 17 17:25:16.827820 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.827779 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6hv\" (UniqueName: \"kubernetes.io/projected/76baba93-cb35-46ce-b2f4-05ea81e5ce12-kube-api-access-2f6hv\") pod \"migrator-74bb7799d9-w5fh4\" (UID: \"76baba93-cb35-46ce-b2f4-05ea81e5ce12\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" Apr 17 17:25:16.846091 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.846064 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:25:16.846463 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.846447 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/0.log" Apr 17 17:25:16.846509 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.846482 2579 generic.go:358] "Generic (PLEG): container finished" podID="d7d8f932-f808-4d50-8b55-ad125b8b9a2c" containerID="8f66eaa28fb800217ed454a988db30c7364164aa10f9fff60e7f877e82270670" exitCode=255 Apr 17 17:25:16.846620 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.846575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" event={"ID":"d7d8f932-f808-4d50-8b55-ad125b8b9a2c","Type":"ContainerDied","Data":"8f66eaa28fb800217ed454a988db30c7364164aa10f9fff60e7f877e82270670"} Apr 17 17:25:16.846667 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.846645 2579 scope.go:117] "RemoveContainer" containerID="48aac6d66f4d5a79e6dabcf694e4cb652f2f31f2abe9e61d10162def9600545a" Apr 17 17:25:16.846898 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.846876 2579 scope.go:117] "RemoveContainer" containerID="8f66eaa28fb800217ed454a988db30c7364164aa10f9fff60e7f877e82270670" Apr 17 17:25:16.847120 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:16.847096 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-z445c_openshift-console-operator(d7d8f932-f808-4d50-8b55-ad125b8b9a2c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" podUID="d7d8f932-f808-4d50-8b55-ad125b8b9a2c" Apr 17 17:25:16.849154 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.849131 2579 generic.go:358] "Generic (PLEG): container finished" podID="28463658-293e-4847-bb58-c40452c9ceba" containerID="efa567a53401a9057b0e13bad5e66ec9db1beeba5f788b563e18f9f927ad9411" exitCode=0 Apr 17 17:25:16.849308 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:25:16.849250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerDied","Data":"efa567a53401a9057b0e13bad5e66ec9db1beeba5f788b563e18f9f927ad9411"} Apr 17 17:25:16.928293 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.928260 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6hv\" (UniqueName: \"kubernetes.io/projected/76baba93-cb35-46ce-b2f4-05ea81e5ce12-kube-api-access-2f6hv\") pod \"migrator-74bb7799d9-w5fh4\" (UID: \"76baba93-cb35-46ce-b2f4-05ea81e5ce12\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" Apr 17 17:25:16.937517 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.937493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6hv\" (UniqueName: \"kubernetes.io/projected/76baba93-cb35-46ce-b2f4-05ea81e5ce12-kube-api-access-2f6hv\") pod \"migrator-74bb7799d9-w5fh4\" (UID: \"76baba93-cb35-46ce-b2f4-05ea81e5ce12\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" Apr 17 17:25:16.946342 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.946324 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-szhhg_eb9e24d4-7146-488f-a450-9bd6feba5465/dns-node-resolver/0.log" Apr 17 17:25:16.956609 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:16.956571 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" Apr 17 17:25:17.073045 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.073013 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4"] Apr 17 17:25:17.076058 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:17.076031 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76baba93_cb35_46ce_b2f4_05ea81e5ce12.slice/crio-0b8f24c183816e21d1423a7fdd0c3d90cf0a58ff2c878e77b8b69743cd5039f5 WatchSource:0}: Error finding container 0b8f24c183816e21d1423a7fdd0c3d90cf0a58ff2c878e77b8b69743cd5039f5: Status 404 returned error can't find the container with id 0b8f24c183816e21d1423a7fdd0c3d90cf0a58ff2c878e77b8b69743cd5039f5 Apr 17 17:25:17.746550 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.746524 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9szwb_d18b6520-db25-43a0-bca5-6990fef41e34/node-ca/0.log" Apr 17 17:25:17.857701 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.857639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" event={"ID":"28463658-293e-4847-bb58-c40452c9ceba","Type":"ContainerStarted","Data":"06f6f0fff10c8b8975ef6999df514f3fa60eb7a69eb566e27a9915acdbf0b24c"} Apr 17 17:25:17.859089 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.859066 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:25:17.859441 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.859421 2579 scope.go:117] "RemoveContainer" containerID="8f66eaa28fb800217ed454a988db30c7364164aa10f9fff60e7f877e82270670" Apr 17 17:25:17.859674 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:17.859652 2579 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-z445c_openshift-console-operator(d7d8f932-f808-4d50-8b55-ad125b8b9a2c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" podUID="d7d8f932-f808-4d50-8b55-ad125b8b9a2c" Apr 17 17:25:17.860259 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.860237 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" event={"ID":"76baba93-cb35-46ce-b2f4-05ea81e5ce12","Type":"ContainerStarted","Data":"0b8f24c183816e21d1423a7fdd0c3d90cf0a58ff2c878e77b8b69743cd5039f5"} Apr 17 17:25:17.882288 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:17.882232 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z8qx6" podStartSLOduration=6.53376848 podStartE2EDuration="44.882213285s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:24:36.312370919 +0000 UTC m=+3.219103715" lastFinishedPulling="2026-04-17 17:25:14.660815728 +0000 UTC m=+41.567548520" observedRunningTime="2026-04-17 17:25:17.880761422 +0000 UTC m=+44.787494240" watchObservedRunningTime="2026-04-17 17:25:17.882213285 +0000 UTC m=+44.788946090" Apr 17 17:25:18.747346 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.747315 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dsd5w"] Apr 17 17:25:18.750504 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.750480 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.752864 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.752839 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 17:25:18.753611 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.753569 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 17:25:18.753724 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.753614 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 17:25:18.753724 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.753654 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-sq4qc\"" Apr 17 17:25:18.753724 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.753666 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 17:25:18.758545 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.758523 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dsd5w"] Apr 17 17:25:18.846099 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.846064 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2d6\" (UniqueName: \"kubernetes.io/projected/0c9f375e-d669-459b-88df-3a6cd68c27b3-kube-api-access-cw2d6\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.846308 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.846213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/0c9f375e-d669-459b-88df-3a6cd68c27b3-signing-cabundle\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.846377 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.846349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0c9f375e-d669-459b-88df-3a6cd68c27b3-signing-key\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.864450 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.864415 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" event={"ID":"76baba93-cb35-46ce-b2f4-05ea81e5ce12","Type":"ContainerStarted","Data":"6db1bed5d6a3d05676343257701a0df45e5f5fc5ecc4a0742d3cdaa3c96a3997"} Apr 17 17:25:18.864869 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.864452 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" event={"ID":"76baba93-cb35-46ce-b2f4-05ea81e5ce12","Type":"ContainerStarted","Data":"3aa9b7ff4ab05123226a2125c4928548cefd6fd4459b7e21f4bf4299554f9706"} Apr 17 17:25:18.881703 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.881654 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w5fh4" podStartSLOduration=1.945946475 podStartE2EDuration="2.881638654s" podCreationTimestamp="2026-04-17 17:25:16 +0000 UTC" firstStartedPulling="2026-04-17 17:25:17.07792679 +0000 UTC m=+43.984659584" lastFinishedPulling="2026-04-17 17:25:18.013618974 +0000 UTC m=+44.920351763" observedRunningTime="2026-04-17 17:25:18.880830953 +0000 UTC m=+45.787563754" watchObservedRunningTime="2026-04-17 
17:25:18.881638654 +0000 UTC m=+45.788371456" Apr 17 17:25:18.946854 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.946817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0c9f375e-d669-459b-88df-3a6cd68c27b3-signing-key\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.947049 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.946864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2d6\" (UniqueName: \"kubernetes.io/projected/0c9f375e-d669-459b-88df-3a6cd68c27b3-kube-api-access-cw2d6\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.947049 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.946971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0c9f375e-d669-459b-88df-3a6cd68c27b3-signing-cabundle\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.947649 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.947624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0c9f375e-d669-459b-88df-3a6cd68c27b3-signing-cabundle\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.949367 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.949343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0c9f375e-d669-459b-88df-3a6cd68c27b3-signing-key\") pod 
\"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:18.954832 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:18.954803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2d6\" (UniqueName: \"kubernetes.io/projected/0c9f375e-d669-459b-88df-3a6cd68c27b3-kube-api-access-cw2d6\") pod \"service-ca-865cb79987-dsd5w\" (UID: \"0c9f375e-d669-459b-88df-3a6cd68c27b3\") " pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:19.060081 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:19.059972 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dsd5w" Apr 17 17:25:19.174569 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:19.174537 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dsd5w"] Apr 17 17:25:19.178148 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:19.178123 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c9f375e_d669_459b_88df_3a6cd68c27b3.slice/crio-44625b723f2c035f37513db5a5f5d42048553409ad7e23dcaab84a19a8c74b52 WatchSource:0}: Error finding container 44625b723f2c035f37513db5a5f5d42048553409ad7e23dcaab84a19a8c74b52: Status 404 returned error can't find the container with id 44625b723f2c035f37513db5a5f5d42048553409ad7e23dcaab84a19a8c74b52 Apr 17 17:25:19.869237 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:19.869197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dsd5w" event={"ID":"0c9f375e-d669-459b-88df-3a6cd68c27b3","Type":"ContainerStarted","Data":"840c176cc92f640a91ded6ae8eca3494786b110141e83e91e127ea26dad02e26"} Apr 17 17:25:19.869237 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:19.869244 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-865cb79987-dsd5w" event={"ID":"0c9f375e-d669-459b-88df-3a6cd68c27b3","Type":"ContainerStarted","Data":"44625b723f2c035f37513db5a5f5d42048553409ad7e23dcaab84a19a8c74b52"} Apr 17 17:25:19.933728 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:19.933667 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-dsd5w" podStartSLOduration=1.9336475050000002 podStartE2EDuration="1.933647505s" podCreationTimestamp="2026-04-17 17:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:19.93297838 +0000 UTC m=+46.839711182" watchObservedRunningTime="2026-04-17 17:25:19.933647505 +0000 UTC m=+46.840380312" Apr 17 17:25:20.864159 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:20.864115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:25:20.866418 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:20.866394 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95-original-pull-secret\") pod \"global-pull-secret-syncer-5j2pv\" (UID: \"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95\") " pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:25:21.049454 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.049417 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5j2pv" Apr 17 17:25:21.194269 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.194236 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5j2pv"] Apr 17 17:25:21.197984 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:21.197943 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84e8ad7_d1f5_4adb_a051_4a0ab84d4e95.slice/crio-791836b2c5d88e502c1a2e3ecdd79a57df88b7c16a0e0d691154aeb4ed84a3c4 WatchSource:0}: Error finding container 791836b2c5d88e502c1a2e3ecdd79a57df88b7c16a0e0d691154aeb4ed84a3c4: Status 404 returned error can't find the container with id 791836b2c5d88e502c1a2e3ecdd79a57df88b7c16a0e0d691154aeb4ed84a3c4 Apr 17 17:25:21.369315 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.369222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:21.369315 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.369290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:21.369495 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.369384 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:25:21.369495 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.369416 2579 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.369396736 +0000 UTC m=+64.276129517 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : configmap references non-existent config key: service-ca.crt Apr 17 17:25:21.369495 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.369448 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs podName:757e5944-43d8-40d9-bf59-81391d9f77cf nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.369441217 +0000 UTC m=+64.276173997 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs") pod "router-default-5f49dd587d-rxz8k" (UID: "757e5944-43d8-40d9-bf59-81391d9f77cf") : secret "router-metrics-certs-default" not found Apr 17 17:25:21.469959 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.469925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" Apr 17 17:25:21.470121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.469988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:21.470121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.470015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:21.470121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.470046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:21.470121 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470081 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 17:25:21.470267 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470138 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:21.470267 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470153 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:25:21.470267 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470167 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not 
found Apr 17 17:25:21.470267 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470156 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert podName:b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.470135906 +0000 UTC m=+64.376868690 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fdkcf" (UID: "b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5") : secret "networking-console-plugin-cert" not found Apr 17 17:25:21.470267 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470207 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls podName:f6656a7d-18be-4793-8aae-ca80248fd4ac nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.470195502 +0000 UTC m=+64.376928290 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls") pod "dns-default-46l9f" (UID: "f6656a7d-18be-4793-8aae-ca80248fd4ac") : secret "dns-default-metrics-tls" not found Apr 17 17:25:21.470267 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.470231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:21.470470 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470292 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:21.470470 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470307 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fc859f89d-s6vt8: secret "image-registry-tls" not found Apr 17 17:25:21.470470 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470294 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls podName:9617e4e1-7e6f-467b-92b9-0a933eb32dc6 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.470282484 +0000 UTC m=+64.377015278 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-5647w" (UID: "9617e4e1-7e6f-467b-92b9-0a933eb32dc6") : secret "samples-operator-tls" not found Apr 17 17:25:21.470470 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470351 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls podName:2e1343de-f52a-4262-8e36-2270dd39d6a2 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.470338861 +0000 UTC m=+64.377071646 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-tzqfr" (UID: "2e1343de-f52a-4262-8e36-2270dd39d6a2") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:21.470470 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.470365 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls podName:c0da0d0b-459b-4fcc-a426-b97d20867b60 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.470355449 +0000 UTC m=+64.377088230 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls") pod "image-registry-5fc859f89d-s6vt8" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60") : secret "image-registry-tls" not found Apr 17 17:25:21.571626 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.571572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x" Apr 17 17:25:21.571787 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.571717 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:21.571787 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:21.571778 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert podName:6ba0a3f3-4f4d-4ea1-a514-e449db3682e3 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.571763824 +0000 UTC m=+64.478496604 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert") pod "ingress-canary-8mf4x" (UID: "6ba0a3f3-4f4d-4ea1-a514-e449db3682e3") : secret "canary-serving-cert" not found Apr 17 17:25:21.875407 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:21.875360 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5j2pv" event={"ID":"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95","Type":"ContainerStarted","Data":"791836b2c5d88e502c1a2e3ecdd79a57df88b7c16a0e0d691154aeb4ed84a3c4"} Apr 17 17:25:25.892623 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:25.892564 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5j2pv" event={"ID":"f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95","Type":"ContainerStarted","Data":"e5fa97393e66f1b239bddbbd7c0196789e3da92e4d71ba32a30afc68cd23170c"} Apr 17 17:25:25.907443 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:25.907386 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5j2pv" podStartSLOduration=32.864517593 podStartE2EDuration="36.907370533s" podCreationTimestamp="2026-04-17 17:24:49 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.200046686 +0000 UTC m=+48.106779480" lastFinishedPulling="2026-04-17 17:25:25.242899626 +0000 UTC m=+52.149632420" observedRunningTime="2026-04-17 17:25:25.907190771 +0000 UTC m=+52.813923575" watchObservedRunningTime="2026-04-17 17:25:25.907370533 +0000 UTC m=+52.814103336" Apr 17 17:25:25.968849 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:25.968815 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:25.968849 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:25.968854 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:25.969212 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:25.969200 2579 scope.go:117] "RemoveContainer" containerID="8f66eaa28fb800217ed454a988db30c7364164aa10f9fff60e7f877e82270670" Apr 17 17:25:25.969395 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:25.969376 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-z445c_openshift-console-operator(d7d8f932-f808-4d50-8b55-ad125b8b9a2c)\"" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" podUID="d7d8f932-f808-4d50-8b55-ad125b8b9a2c" Apr 17 17:25:31.778814 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:31.778784 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-h29v4" Apr 17 17:25:36.636991 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:36.636960 2579 scope.go:117] "RemoveContainer" containerID="8f66eaa28fb800217ed454a988db30c7364164aa10f9fff60e7f877e82270670" Apr 17 17:25:36.924328 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:36.924251 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:25:36.924328 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:36.924319 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" event={"ID":"d7d8f932-f808-4d50-8b55-ad125b8b9a2c","Type":"ContainerStarted","Data":"1f2fa9374b42b8531aac01da5a229e43d198fa554f9f09daa16616818e8d02ad"} Apr 17 17:25:36.924700 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:36.924683 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-z445c" Apr 17 17:25:36.942746 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:36.942694 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-z445c" podStartSLOduration=52.4647967 podStartE2EDuration="1m0.94267959s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:06.188389529 +0000 UTC m=+33.095122309" lastFinishedPulling="2026-04-17 17:25:14.66627241 +0000 UTC m=+41.573005199" observedRunningTime="2026-04-17 17:25:36.9418893 +0000 UTC m=+63.848622104" watchObservedRunningTime="2026-04-17 17:25:36.94267959 +0000 UTC m=+63.849412394" Apr 17 17:25:37.414437 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.414400 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:37.414670 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.414534 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:37.415118 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.415089 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/757e5944-43d8-40d9-bf59-81391d9f77cf-service-ca-bundle\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 
17:25:37.416812 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.416789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/757e5944-43d8-40d9-bf59-81391d9f77cf-metrics-certs\") pod \"router-default-5f49dd587d-rxz8k\" (UID: \"757e5944-43d8-40d9-bf59-81391d9f77cf\") " pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:37.515576 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.515530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" Apr 17 17:25:37.515576 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.515578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" Apr 17 17:25:37.515843 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.515625 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:37.515843 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.515671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") 
pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:37.515843 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.515702 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:37.518245 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.518219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e1343de-f52a-4262-8e36-2270dd39d6a2-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-tzqfr\" (UID: \"2e1343de-f52a-4262-8e36-2270dd39d6a2\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"
Apr 17 17:25:37.518360 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.518294 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"image-registry-5fc859f89d-s6vt8\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:37.518427 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.518408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f6656a7d-18be-4793-8aae-ca80248fd4ac-metrics-tls\") pod \"dns-default-46l9f\" (UID: \"f6656a7d-18be-4793-8aae-ca80248fd4ac\") " pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:37.518676 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.518652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9617e4e1-7e6f-467b-92b9-0a933eb32dc6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-5647w\" (UID: \"9617e4e1-7e6f-467b-92b9-0a933eb32dc6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"
Apr 17 17:25:37.518809 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.518791 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fdkcf\" (UID: \"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:37.607923 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.607891 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-z445c"
Apr 17 17:25:37.616208 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.616175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x"
Apr 17 17:25:37.619686 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.619662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ba0a3f3-4f4d-4ea1-a514-e449db3682e3-cert\") pod \"ingress-canary-8mf4x\" (UID: \"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3\") " pod="openshift-ingress-canary/ingress-canary-8mf4x"
Apr 17 17:25:37.655206 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.655147 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-86rgw\""
Apr 17 17:25:37.663153 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.663118 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5f49dd587d-rxz8k"
Apr 17 17:25:37.665335 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.665249 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2scp9\""
Apr 17 17:25:37.674192 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.674148 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"
Apr 17 17:25:37.700735 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.700704 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qvzpk\""
Apr 17 17:25:37.708737 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.708655 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8"
Apr 17 17:25:37.717483 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.717268 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9l5w8\""
Apr 17 17:25:37.725776 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.725082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-46l9f"
Apr 17 17:25:37.725776 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.725371 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5qqw4\""
Apr 17 17:25:37.733194 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.732968 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"
Apr 17 17:25:37.748098 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.747824 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rw8lm\""
Apr 17 17:25:37.759881 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.755728 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"
Apr 17 17:25:37.785896 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.785357 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9t7x\""
Apr 17 17:25:37.799811 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.799300 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8mf4x"
Apr 17 17:25:37.861483 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.860660 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f49dd587d-rxz8k"]
Apr 17 17:25:37.916189 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.915802 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr"]
Apr 17 17:25:37.944497 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.944189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" event={"ID":"2e1343de-f52a-4262-8e36-2270dd39d6a2","Type":"ContainerStarted","Data":"c1b9f4a4d928f72eabffb487bbab36dc52a45ef92b569eeea0dda23d394d7b26"}
Apr 17 17:25:37.946717 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.946615 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" event={"ID":"757e5944-43d8-40d9-bf59-81391d9f77cf","Type":"ContainerStarted","Data":"403f07ab2f2daacddb960777939096c949dded6fdd0edd17b3c63fe7d0a94c93"}
Apr 17 17:25:37.953884 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.953054 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5fc859f89d-s6vt8"]
Apr 17 17:25:37.955992 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:37.955952 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0da0d0b_459b_4fcc_a426_b97d20867b60.slice/crio-3db52795e4c7207e47be1f2ee78c128c2c9fa822c13299452cffe11afb24d8c7 WatchSource:0}: Error finding container 3db52795e4c7207e47be1f2ee78c128c2c9fa822c13299452cffe11afb24d8c7: Status 404 returned error can't find the container with id 3db52795e4c7207e47be1f2ee78c128c2c9fa822c13299452cffe11afb24d8c7
Apr 17 17:25:37.981134 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.981093 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-46l9f"]
Apr 17 17:25:37.984792 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:37.984760 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6656a7d_18be_4793_8aae_ca80248fd4ac.slice/crio-5c15b229779b2d6bd0854f1e15b7923e2e690c00677945128da30726c58e1185 WatchSource:0}: Error finding container 5c15b229779b2d6bd0854f1e15b7923e2e690c00677945128da30726c58e1185: Status 404 returned error can't find the container with id 5c15b229779b2d6bd0854f1e15b7923e2e690c00677945128da30726c58e1185
Apr 17 17:25:37.995079 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:37.995024 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w"]
Apr 17 17:25:38.025667 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.025546 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf"]
Apr 17 17:25:38.036665 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:38.036642 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dbc8c2_d7a9_453d_8ebc_1e0c119d98e5.slice/crio-d8301ee7e15cd49ea41260ffee3246104d28bb54e2d82b4b37622fd60b535bb8 WatchSource:0}: Error finding container d8301ee7e15cd49ea41260ffee3246104d28bb54e2d82b4b37622fd60b535bb8: Status 404 returned error can't find the container with id d8301ee7e15cd49ea41260ffee3246104d28bb54e2d82b4b37622fd60b535bb8
Apr 17 17:25:38.053490 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.053449 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8mf4x"]
Apr 17 17:25:38.060806 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:38.060778 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba0a3f3_4f4d_4ea1_a514_e449db3682e3.slice/crio-e4b8735a92cd851753f7a014f37f142e1bffbd284458719717731a15291b3f45 WatchSource:0}: Error finding container e4b8735a92cd851753f7a014f37f142e1bffbd284458719717731a15291b3f45: Status 404 returned error can't find the container with id e4b8735a92cd851753f7a014f37f142e1bffbd284458719717731a15291b3f45
Apr 17 17:25:38.150508 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.150462 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"]
Apr 17 17:25:38.154042 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.154014 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.155332 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.155309 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"]
Apr 17 17:25:38.157484 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.157463 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 17:25:38.157929 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.157913 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 17:25:38.158122 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.158093 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 17:25:38.158220 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.158167 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.158380 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.158356 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 17:25:38.158556 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.158490 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 17:25:38.159655 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.159634 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 17:25:38.164660 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.164639 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-nt5t5\""
Apr 17 17:25:38.164908 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.164892 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 17:25:38.165644 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.165625 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"]
Apr 17 17:25:38.167054 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.167007 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 17:25:38.168533 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.168515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.171951 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.171933 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 17:25:38.180320 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.180297 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"]
Apr 17 17:25:38.181277 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.181242 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"]
Apr 17 17:25:38.190149 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.190119 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"]
Apr 17 17:25:38.224296 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3f0c4a92-d09a-422b-ba0b-9b62bc614519-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.224296 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59gv5\" (UniqueName: \"kubernetes.io/projected/e990542e-7f3f-4772-a138-6d6aa2959526-kube-api-access-59gv5\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.224494 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90956b23-bba2-4101-9b42-b9514db5d135-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b87b4f6f-wx92j\" (UID: \"90956b23-bba2-4101-9b42-b9514db5d135\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.224494 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224391 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-hub\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.224494 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224424 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.224494 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d86qz\" (UniqueName: \"kubernetes.io/projected/90956b23-bba2-4101-9b42-b9514db5d135-kube-api-access-d86qz\") pod \"managed-serviceaccount-addon-agent-8b87b4f6f-wx92j\" (UID: \"90956b23-bba2-4101-9b42-b9514db5d135\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.224634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-ca\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.224634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224522 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsgcd\" (UniqueName: \"kubernetes.io/projected/3f0c4a92-d09a-422b-ba0b-9b62bc614519-kube-api-access-hsgcd\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.224634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e990542e-7f3f-4772-a138-6d6aa2959526-tmp\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.224634 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224621 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.224754 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.224642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e990542e-7f3f-4772-a138-6d6aa2959526-klusterlet-config\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.250718 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.250683 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-8z5d2"]
Apr 17 17:25:38.253978 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.253952 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8z5d2"
Apr 17 17:25:38.256246 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.256222 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 17:25:38.256246 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.256255 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 17:25:38.257387 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.257199 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-rtlh6\""
Apr 17 17:25:38.274005 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.273927 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8z5d2"]
Apr 17 17:25:38.325901 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.325862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e990542e-7f3f-4772-a138-6d6aa2959526-tmp\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.325901 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.325905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.326159 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e990542e-7f3f-4772-a138-6d6aa2959526-klusterlet-config\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.326159 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gphs5\" (UniqueName: \"kubernetes.io/projected/fc5c9760-32a8-4cb9-9740-2192f5719cd7-kube-api-access-gphs5\") pod \"downloads-6bcc868b7-8z5d2\" (UID: \"fc5c9760-32a8-4cb9-9740-2192f5719cd7\") " pod="openshift-console/downloads-6bcc868b7-8z5d2"
Apr 17 17:25:38.326265 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326174 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3f0c4a92-d09a-422b-ba0b-9b62bc614519-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.326265 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59gv5\" (UniqueName: \"kubernetes.io/projected/e990542e-7f3f-4772-a138-6d6aa2959526-kube-api-access-59gv5\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.326265 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e990542e-7f3f-4772-a138-6d6aa2959526-tmp\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.326433 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90956b23-bba2-4101-9b42-b9514db5d135-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b87b4f6f-wx92j\" (UID: \"90956b23-bba2-4101-9b42-b9514db5d135\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.326433 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-hub\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.326433 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326324 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.326433 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d86qz\" (UniqueName: \"kubernetes.io/projected/90956b23-bba2-4101-9b42-b9514db5d135-kube-api-access-d86qz\") pod \"managed-serviceaccount-addon-agent-8b87b4f6f-wx92j\" (UID: \"90956b23-bba2-4101-9b42-b9514db5d135\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.326433 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326396 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-ca\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.326433 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.326422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsgcd\" (UniqueName: \"kubernetes.io/projected/3f0c4a92-d09a-422b-ba0b-9b62bc614519-kube-api-access-hsgcd\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.327135 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.327031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3f0c4a92-d09a-422b-ba0b-9b62bc614519-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.329629 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.329488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/90956b23-bba2-4101-9b42-b9514db5d135-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b87b4f6f-wx92j\" (UID: \"90956b23-bba2-4101-9b42-b9514db5d135\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.329629 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.329525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-hub\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.329629 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.329558 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e990542e-7f3f-4772-a138-6d6aa2959526-klusterlet-config\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.329629 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.329615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.329927 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.329905 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-ca\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.330164 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.330137 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3f0c4a92-d09a-422b-ba0b-9b62bc614519-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.338289 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.338263 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsgcd\" (UniqueName: \"kubernetes.io/projected/3f0c4a92-d09a-422b-ba0b-9b62bc614519-kube-api-access-hsgcd\") pod \"cluster-proxy-proxy-agent-898cfd57b-wzvrt\" (UID: \"3f0c4a92-d09a-422b-ba0b-9b62bc614519\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.340359 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.340330 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d86qz\" (UniqueName: \"kubernetes.io/projected/90956b23-bba2-4101-9b42-b9514db5d135-kube-api-access-d86qz\") pod \"managed-serviceaccount-addon-agent-8b87b4f6f-wx92j\" (UID: \"90956b23-bba2-4101-9b42-b9514db5d135\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.341063 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.341042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59gv5\" (UniqueName: \"kubernetes.io/projected/e990542e-7f3f-4772-a138-6d6aa2959526-kube-api-access-59gv5\") pod \"klusterlet-addon-workmgr-7644c9f98b-hphct\" (UID: \"e990542e-7f3f-4772-a138-6d6aa2959526\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.344672 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.344649 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xhfqz"]
Apr 17 17:25:38.349065 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.349046 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.351284 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.351265 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 17:25:38.351373 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.351303 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4n9s5\""
Apr 17 17:25:38.351373 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.351359 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 17:25:38.361927 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.361903 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xhfqz"]
Apr 17 17:25:38.427700 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.427618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8072177b-ab57-4392-ae96-968f804bc12a-crio-socket\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.427700 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.427670 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8072177b-ab57-4392-ae96-968f804bc12a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.427905 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.427736 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8072177b-ab57-4392-ae96-968f804bc12a-data-volume\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.427905 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.427817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gphs5\" (UniqueName: \"kubernetes.io/projected/fc5c9760-32a8-4cb9-9740-2192f5719cd7-kube-api-access-gphs5\") pod \"downloads-6bcc868b7-8z5d2\" (UID: \"fc5c9760-32a8-4cb9-9740-2192f5719cd7\") " pod="openshift-console/downloads-6bcc868b7-8z5d2"
Apr 17 17:25:38.428015 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.427909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8072177b-ab57-4392-ae96-968f804bc12a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.428015 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.427945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8lt5\" (UniqueName: \"kubernetes.io/projected/8072177b-ab57-4392-ae96-968f804bc12a-kube-api-access-v8lt5\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.438234 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.438179 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gphs5\" (UniqueName: \"kubernetes.io/projected/fc5c9760-32a8-4cb9-9740-2192f5719cd7-kube-api-access-gphs5\") pod \"downloads-6bcc868b7-8z5d2\" (UID: \"fc5c9760-32a8-4cb9-9740-2192f5719cd7\") " pod="openshift-console/downloads-6bcc868b7-8z5d2"
Apr 17 17:25:38.475280 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.475245 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"
Apr 17 17:25:38.494697 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.494670 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"
Apr 17 17:25:38.501245 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.500237 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"
Apr 17 17:25:38.531059 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8072177b-ab57-4392-ae96-968f804bc12a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.531204 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8lt5\" (UniqueName: \"kubernetes.io/projected/8072177b-ab57-4392-ae96-968f804bc12a-kube-api-access-v8lt5\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.531204 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8072177b-ab57-4392-ae96-968f804bc12a-crio-socket\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.531204 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531177 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8072177b-ab57-4392-ae96-968f804bc12a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz"
Apr 17 17:25:38.531369 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531214 2579 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8072177b-ab57-4392-ae96-968f804bc12a-data-volume\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.532169 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531728 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8072177b-ab57-4392-ae96-968f804bc12a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.532169 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.531825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8072177b-ab57-4392-ae96-968f804bc12a-crio-socket\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.532622 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.532561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8072177b-ab57-4392-ae96-968f804bc12a-data-volume\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.545980 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.545921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8lt5\" (UniqueName: \"kubernetes.io/projected/8072177b-ab57-4392-ae96-968f804bc12a-kube-api-access-v8lt5\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.546676 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:25:38.546623 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8072177b-ab57-4392-ae96-968f804bc12a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xhfqz\" (UID: \"8072177b-ab57-4392-ae96-968f804bc12a\") " pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.564561 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.563966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8z5d2" Apr 17 17:25:38.659863 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.658418 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xhfqz" Apr 17 17:25:38.697073 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.697009 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt"] Apr 17 17:25:38.729017 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.728947 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct"] Apr 17 17:25:38.741870 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.741779 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j"] Apr 17 17:25:38.761506 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:38.761433 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90956b23_bba2_4101_9b42_b9514db5d135.slice/crio-efd52b2bb6e863a1ef193e7217d8a608050e5db023484403b4d413da09755064 WatchSource:0}: Error finding container efd52b2bb6e863a1ef193e7217d8a608050e5db023484403b4d413da09755064: Status 404 returned error can't find the container with id 
efd52b2bb6e863a1ef193e7217d8a608050e5db023484403b4d413da09755064 Apr 17 17:25:38.792137 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.791505 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8z5d2"] Apr 17 17:25:38.879448 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.879010 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xhfqz"] Apr 17 17:25:38.881918 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:38.881829 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8072177b_ab57_4392_ae96_968f804bc12a.slice/crio-b66de0b16ef0b98fe57638f15099a5e4beae1f70595781ead856cbac1639880b WatchSource:0}: Error finding container b66de0b16ef0b98fe57638f15099a5e4beae1f70595781ead856cbac1639880b: Status 404 returned error can't find the container with id b66de0b16ef0b98fe57638f15099a5e4beae1f70595781ead856cbac1639880b Apr 17 17:25:38.956569 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.955652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" event={"ID":"c0da0d0b-459b-4fcc-a426-b97d20867b60","Type":"ContainerStarted","Data":"44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01"} Apr 17 17:25:38.956569 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.955697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" event={"ID":"c0da0d0b-459b-4fcc-a426-b97d20867b60","Type":"ContainerStarted","Data":"3db52795e4c7207e47be1f2ee78c128c2c9fa822c13299452cffe11afb24d8c7"} Apr 17 17:25:38.956569 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.956527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:25:38.958584 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:25:38.958522 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8z5d2" event={"ID":"fc5c9760-32a8-4cb9-9740-2192f5719cd7","Type":"ContainerStarted","Data":"58e99d88f1b9e84da8825f9cf9ed36ad488cef04216812e5c57ae0745d860a36"} Apr 17 17:25:38.961753 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.961528 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j" event={"ID":"90956b23-bba2-4101-9b42-b9514db5d135","Type":"ContainerStarted","Data":"efd52b2bb6e863a1ef193e7217d8a608050e5db023484403b4d413da09755064"} Apr 17 17:25:38.965195 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.965124 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" event={"ID":"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5","Type":"ContainerStarted","Data":"d8301ee7e15cd49ea41260ffee3246104d28bb54e2d82b4b37622fd60b535bb8"} Apr 17 17:25:38.966933 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.966906 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" event={"ID":"757e5944-43d8-40d9-bf59-81391d9f77cf","Type":"ContainerStarted","Data":"92b78bbdc51cc3d921cbd08e341130030558e8165ca7b6c1c0113752073c863e"} Apr 17 17:25:38.969464 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.969441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct" event={"ID":"e990542e-7f3f-4772-a138-6d6aa2959526","Type":"ContainerStarted","Data":"6204c32d61a61aa310d6648fcf0d26ca858748e874d165b14154dc65c4176455"} Apr 17 17:25:38.971072 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.971050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt" 
event={"ID":"3f0c4a92-d09a-422b-ba0b-9b62bc614519","Type":"ContainerStarted","Data":"6805dbb7cade861b365d477e4f25c04b64ba0f0ee74593dcfde4924fc0e2a829"} Apr 17 17:25:38.973649 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.973625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-46l9f" event={"ID":"f6656a7d-18be-4793-8aae-ca80248fd4ac","Type":"ContainerStarted","Data":"5c15b229779b2d6bd0854f1e15b7923e2e690c00677945128da30726c58e1185"} Apr 17 17:25:38.975216 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.975194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8mf4x" event={"ID":"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3","Type":"ContainerStarted","Data":"e4b8735a92cd851753f7a014f37f142e1bffbd284458719717731a15291b3f45"} Apr 17 17:25:38.977253 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.977137 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" event={"ID":"9617e4e1-7e6f-467b-92b9-0a933eb32dc6","Type":"ContainerStarted","Data":"98bf6632ba6a6b2bf45c5e2e316d772246593d5b88c96016c930c5fa63fd27b0"} Apr 17 17:25:38.981056 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.981011 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" podStartSLOduration=64.980997572 podStartE2EDuration="1m4.980997572s" podCreationTimestamp="2026-04-17 17:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:38.980183876 +0000 UTC m=+65.886916679" watchObservedRunningTime="2026-04-17 17:25:38.980997572 +0000 UTC m=+65.887730376" Apr 17 17:25:38.985102 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:38.984312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xhfqz" 
event={"ID":"8072177b-ab57-4392-ae96-968f804bc12a","Type":"ContainerStarted","Data":"b66de0b16ef0b98fe57638f15099a5e4beae1f70595781ead856cbac1639880b"} Apr 17 17:25:39.005249 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.004838 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" podStartSLOduration=63.004818678 podStartE2EDuration="1m3.004818678s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:39.003437722 +0000 UTC m=+65.910170526" watchObservedRunningTime="2026-04-17 17:25:39.004818678 +0000 UTC m=+65.911551484" Apr 17 17:25:39.345776 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.345183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:25:39.350989 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.350959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d168f0a0-7fcd-4905-a424-24a94b7fcdbb-metrics-certs\") pod \"network-metrics-daemon-z942n\" (UID: \"d168f0a0-7fcd-4905-a424-24a94b7fcdbb\") " pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:25:39.361564 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.361335 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vr6gd\"" Apr 17 17:25:39.369349 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.369321 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z942n" Apr 17 17:25:39.669323 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.666783 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:39.670247 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.670047 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:39.751681 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:39.751002 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z942n"] Apr 17 17:25:40.001796 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:40.001753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xhfqz" event={"ID":"8072177b-ab57-4392-ae96-968f804bc12a","Type":"ContainerStarted","Data":"27eef2d29f8ede8c6421984979ea99b9be6b507637d62a7300e699ae6621c629"} Apr 17 17:25:40.015075 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:40.014955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" event={"ID":"b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5","Type":"ContainerStarted","Data":"b5e3ff7e55c53471eacaaf517ee239bc3349ff5b290a2721af5b59f636008625"} Apr 17 17:25:40.015492 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:40.015438 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:40.017072 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:40.017034 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5f49dd587d-rxz8k" Apr 17 17:25:40.047739 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:40.046428 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-fdkcf" podStartSLOduration=35.517084108 podStartE2EDuration="37.046409103s" podCreationTimestamp="2026-04-17 17:25:03 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.039101926 +0000 UTC m=+64.945834706" lastFinishedPulling="2026-04-17 17:25:39.568426906 +0000 UTC m=+66.475159701" observedRunningTime="2026-04-17 17:25:40.045172524 +0000 UTC m=+66.951905329" watchObservedRunningTime="2026-04-17 17:25:40.046409103 +0000 UTC m=+66.953141905" Apr 17 17:25:42.857704 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:42.857662 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd168f0a0_7fcd_4905_a424_24a94b7fcdbb.slice/crio-9329dd444dac5e65c32382234c7193f0ccc1b140c44f1af10be5f5000a6fd36d WatchSource:0}: Error finding container 9329dd444dac5e65c32382234c7193f0ccc1b140c44f1af10be5f5000a6fd36d: Status 404 returned error can't find the container with id 9329dd444dac5e65c32382234c7193f0ccc1b140c44f1af10be5f5000a6fd36d Apr 17 17:25:43.033911 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:43.033868 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z942n" event={"ID":"d168f0a0-7fcd-4905-a424-24a94b7fcdbb","Type":"ContainerStarted","Data":"9329dd444dac5e65c32382234c7193f0ccc1b140c44f1af10be5f5000a6fd36d"} Apr 17 17:25:47.862868 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:47.862829 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m8x5g" Apr 17 17:25:48.059306 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.059269 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct" event={"ID":"e990542e-7f3f-4772-a138-6d6aa2959526","Type":"ContainerStarted","Data":"8c9b3e4723be476eed1d2c465a4fd08a38f53b04284f3d125034463f45f1ce6b"} 
Apr 17 17:25:48.060488 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.060401 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct" Apr 17 17:25:48.062603 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.062559 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct" Apr 17 17:25:48.065263 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.065212 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8mf4x" event={"ID":"6ba0a3f3-4f4d-4ea1-a514-e449db3682e3","Type":"ContainerStarted","Data":"3bcffd2f0f88ca6dfc4321c1808bb0017788e35d955617a9f155332d6dd8019e"} Apr 17 17:25:48.068512 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.068037 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" event={"ID":"2e1343de-f52a-4262-8e36-2270dd39d6a2","Type":"ContainerStarted","Data":"beee7fee9181bbe82a4162b0aa0a2e7c8f1faf5c0a6d4fbf553b6cf651328d36"} Apr 17 17:25:48.080497 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.080218 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7644c9f98b-hphct" podStartSLOduration=0.957034715 podStartE2EDuration="10.080199622s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.746923351 +0000 UTC m=+65.653656145" lastFinishedPulling="2026-04-17 17:25:47.870088255 +0000 UTC m=+74.776821052" observedRunningTime="2026-04-17 17:25:48.076709285 +0000 UTC m=+74.983442088" watchObservedRunningTime="2026-04-17 17:25:48.080199622 +0000 UTC m=+74.986932426" Apr 17 17:25:48.113208 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.112887 2579 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-tzqfr" podStartSLOduration=62.195596832 podStartE2EDuration="1m12.112867107s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:37.924037284 +0000 UTC m=+64.830770080" lastFinishedPulling="2026-04-17 17:25:47.841307573 +0000 UTC m=+74.748040355" observedRunningTime="2026-04-17 17:25:48.109958464 +0000 UTC m=+75.016691267" watchObservedRunningTime="2026-04-17 17:25:48.112867107 +0000 UTC m=+75.019599911" Apr 17 17:25:48.163096 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:48.162685 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8mf4x" podStartSLOduration=33.384839923 podStartE2EDuration="43.162663308s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.063200819 +0000 UTC m=+64.969933600" lastFinishedPulling="2026-04-17 17:25:47.841024192 +0000 UTC m=+74.747756985" observedRunningTime="2026-04-17 17:25:48.157432175 +0000 UTC m=+75.064164969" watchObservedRunningTime="2026-04-17 17:25:48.162663308 +0000 UTC m=+75.069396111" Apr 17 17:25:49.075668 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.074970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j" event={"ID":"90956b23-bba2-4101-9b42-b9514db5d135","Type":"ContainerStarted","Data":"9c257841ea7a3167455b33cbcfe5b7541d0895cbfed339b05156c0ee8d815fdd"} Apr 17 17:25:49.077856 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.077816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt" event={"ID":"3f0c4a92-d09a-422b-ba0b-9b62bc614519","Type":"ContainerStarted","Data":"c35fc556b8f47612778ee14ba17382602fb0c2b65294acd6ffa0ac31cf226c67"} Apr 17 17:25:49.080871 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:25:49.080700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-46l9f" event={"ID":"f6656a7d-18be-4793-8aae-ca80248fd4ac","Type":"ContainerStarted","Data":"5952b0466fea9fabb479eaa62205e7e83413c92d0d68bef3a0621aef0831eab7"} Apr 17 17:25:49.080871 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.080735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-46l9f" event={"ID":"f6656a7d-18be-4793-8aae-ca80248fd4ac","Type":"ContainerStarted","Data":"db8607eacd3fe3e81fdc94a665ed20a6c4a59bfa663be7370629ab65a9c31816"} Apr 17 17:25:49.082112 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.081305 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-46l9f" Apr 17 17:25:49.084716 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.084102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" event={"ID":"9617e4e1-7e6f-467b-92b9-0a933eb32dc6","Type":"ContainerStarted","Data":"b64c936e00454d3274b3e746a560adfb5076227eff1d2b2b8edf3d686b6da98a"} Apr 17 17:25:49.084716 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.084138 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" event={"ID":"9617e4e1-7e6f-467b-92b9-0a933eb32dc6","Type":"ContainerStarted","Data":"a51172f1190c00753009486c111a4f43e6f28753eb2f1bf4bb8cd0742d6fb40d"} Apr 17 17:25:49.086468 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.086440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z942n" event={"ID":"d168f0a0-7fcd-4905-a424-24a94b7fcdbb","Type":"ContainerStarted","Data":"01850ddefcda66bc1873b80a85000dfbb80b478b43102b0c6b14457ce0c476f0"} Apr 17 17:25:49.086570 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.086475 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/network-metrics-daemon-z942n" event={"ID":"d168f0a0-7fcd-4905-a424-24a94b7fcdbb","Type":"ContainerStarted","Data":"dd160eb828f7c746459943dc4712951da21c23577ce6db3aa58096e3c7ade08c"} Apr 17 17:25:49.089580 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.089543 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xhfqz" event={"ID":"8072177b-ab57-4392-ae96-968f804bc12a","Type":"ContainerStarted","Data":"66212933d38225542a1555d6b6306e4262aea84773d24625141f2046bc1175ec"} Apr 17 17:25:49.094712 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.093011 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b87b4f6f-wx92j" podStartSLOduration=1.982306142 podStartE2EDuration="11.092993926s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.769622453 +0000 UTC m=+65.676355245" lastFinishedPulling="2026-04-17 17:25:47.880310249 +0000 UTC m=+74.787043029" observedRunningTime="2026-04-17 17:25:49.091679117 +0000 UTC m=+75.998411919" watchObservedRunningTime="2026-04-17 17:25:49.092993926 +0000 UTC m=+75.999726730" Apr 17 17:25:49.114185 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.113916 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-46l9f" podStartSLOduration=34.261593672 podStartE2EDuration="44.11389789s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:37.988700667 +0000 UTC m=+64.895433458" lastFinishedPulling="2026-04-17 17:25:47.841004895 +0000 UTC m=+74.747737676" observedRunningTime="2026-04-17 17:25:49.111624796 +0000 UTC m=+76.018357603" watchObservedRunningTime="2026-04-17 17:25:49.11389789 +0000 UTC m=+76.020630694" Apr 17 17:25:49.129630 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.129522 2579 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z942n" podStartSLOduration=70.937595783 podStartE2EDuration="1m16.129506763s" podCreationTimestamp="2026-04-17 17:24:33 +0000 UTC" firstStartedPulling="2026-04-17 17:25:42.860965548 +0000 UTC m=+69.767698343" lastFinishedPulling="2026-04-17 17:25:48.052876539 +0000 UTC m=+74.959609323" observedRunningTime="2026-04-17 17:25:49.12798201 +0000 UTC m=+76.034714814" watchObservedRunningTime="2026-04-17 17:25:49.129506763 +0000 UTC m=+76.036239565" Apr 17 17:25:49.147159 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:49.146932 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-5647w" podStartSLOduration=63.357130343 podStartE2EDuration="1m13.146912149s" podCreationTimestamp="2026-04-17 17:24:36 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.051677337 +0000 UTC m=+64.958410118" lastFinishedPulling="2026-04-17 17:25:47.841459136 +0000 UTC m=+74.748191924" observedRunningTime="2026-04-17 17:25:49.145384304 +0000 UTC m=+76.052117106" watchObservedRunningTime="2026-04-17 17:25:49.146912149 +0000 UTC m=+76.053644952" Apr 17 17:25:51.099173 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:51.098579 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt" event={"ID":"3f0c4a92-d09a-422b-ba0b-9b62bc614519","Type":"ContainerStarted","Data":"640bcc601991ce1f4c08f1cacf544ca9ba5db30a5e84c114fdfa48dca014fd8e"} Apr 17 17:25:51.099173 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:51.098642 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt" event={"ID":"3f0c4a92-d09a-422b-ba0b-9b62bc614519","Type":"ContainerStarted","Data":"b02987df2d38e2766d0f9c13b7eaaf66bf5534d0e27a9fd7dd18ae5591119db7"} Apr 17 17:25:51.100746 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:25:51.100704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xhfqz" event={"ID":"8072177b-ab57-4392-ae96-968f804bc12a","Type":"ContainerStarted","Data":"9111d1ad51c72bc934989f8d1e30a2c6de03044ebcc678936b8300acd9eab1f5"} Apr 17 17:25:51.120414 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:51.120359 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-898cfd57b-wzvrt" podStartSLOduration=1.111012055 podStartE2EDuration="13.120339762s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.70628446 +0000 UTC m=+65.613017254" lastFinishedPulling="2026-04-17 17:25:50.71561218 +0000 UTC m=+77.622344961" observedRunningTime="2026-04-17 17:25:51.119280809 +0000 UTC m=+78.026013611" watchObservedRunningTime="2026-04-17 17:25:51.120339762 +0000 UTC m=+78.027072567" Apr 17 17:25:51.139285 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:51.139208 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xhfqz" podStartSLOduration=1.422044162 podStartE2EDuration="13.139191507s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.993822498 +0000 UTC m=+65.900555295" lastFinishedPulling="2026-04-17 17:25:50.710969843 +0000 UTC m=+77.617702640" observedRunningTime="2026-04-17 17:25:51.138400988 +0000 UTC m=+78.045133793" watchObservedRunningTime="2026-04-17 17:25:51.139191507 +0000 UTC m=+78.045924310" Apr 17 17:25:56.987048 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:56.986994 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5jhj4"] Apr 17 17:25:57.042608 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.041279 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.046601 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.044432 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:25:57.046601 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.044709 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:25:57.046601 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.044876 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-29gws\"" Apr 17 17:25:57.046601 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.045062 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:25:57.046601 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.045217 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:25:57.128934 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.128894 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-textfile\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.128963 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5jhj4\" (UID: 
\"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129004 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/423b3c4f-6603-41d8-aa68-98b535aac2f9-metrics-client-ca\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-sys\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129088 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-wtmp\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129121 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqw4d\" (UniqueName: \"kubernetes.io/projected/423b3c4f-6603-41d8-aa68-98b535aac2f9-kube-api-access-jqw4d\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-root\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129308 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.129378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.129352 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.230605 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/423b3c4f-6603-41d8-aa68-98b535aac2f9-metrics-client-ca\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.230792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-sys\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.230792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230672 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-wtmp\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.230792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqw4d\" (UniqueName: \"kubernetes.io/projected/423b3c4f-6603-41d8-aa68-98b535aac2f9-kube-api-access-jqw4d\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.230792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-root\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.230792 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.231090 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230817 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.231090 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:25:57.230857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-textfile\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.231090 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.230898 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:57.231390 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.231411 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-root\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:57.231469 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls podName:423b3c4f-6603-41d8-aa68-98b535aac2f9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:57.731449341 +0000 UTC m=+84.638182139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls") pod "node-exporter-5jhj4" (UID: "423b3c4f-6603-41d8-aa68-98b535aac2f9") : secret "node-exporter-tls" not found Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.231732 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-textfile\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.231764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-sys\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.231861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-wtmp\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.231936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.232627 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.231980 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/423b3c4f-6603-41d8-aa68-98b535aac2f9-metrics-client-ca\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.234314 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.234292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.243394 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.243335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqw4d\" (UniqueName: \"kubernetes.io/projected/423b3c4f-6603-41d8-aa68-98b535aac2f9-kube-api-access-jqw4d\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.715460 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.715131 2579 patch_prober.go:28] interesting pod/image-registry-5fc859f89d-s6vt8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:25:57.715460 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.715204 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" podUID="c0da0d0b-459b-4fcc-a426-b97d20867b60" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:25:57.737544 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:57.737501 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:57.737703 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:57.737631 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:57.737703 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:25:57.737702 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls podName:423b3c4f-6603-41d8-aa68-98b535aac2f9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.737683216 +0000 UTC m=+85.644416047 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls") pod "node-exporter-5jhj4" (UID: "423b3c4f-6603-41d8-aa68-98b535aac2f9") : secret "node-exporter-tls" not found Apr 17 17:25:58.746694 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:58.746656 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:58.749371 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:58.749340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/423b3c4f-6603-41d8-aa68-98b535aac2f9-node-exporter-tls\") pod \"node-exporter-5jhj4\" (UID: \"423b3c4f-6603-41d8-aa68-98b535aac2f9\") " pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 
17:25:58.857907 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:58.857876 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5jhj4" Apr 17 17:25:58.868860 ip-10-0-130-17 kubenswrapper[2579]: W0417 17:25:58.868826 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod423b3c4f_6603_41d8_aa68_98b535aac2f9.slice/crio-d3da816c23050f2e07d358cfb1f5a805d8fb0a824abe551bf70ab5c973a24cfb WatchSource:0}: Error finding container d3da816c23050f2e07d358cfb1f5a805d8fb0a824abe551bf70ab5c973a24cfb: Status 404 returned error can't find the container with id d3da816c23050f2e07d358cfb1f5a805d8fb0a824abe551bf70ab5c973a24cfb Apr 17 17:25:59.130247 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:59.130199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8z5d2" event={"ID":"fc5c9760-32a8-4cb9-9740-2192f5719cd7","Type":"ContainerStarted","Data":"cc3d92edc72f66a3e153ed842463dcc3ae0acf5634843442d0316409af273d27"} Apr 17 17:25:59.130247 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:59.130256 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-8z5d2" Apr 17 17:25:59.131486 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:59.131443 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jhj4" event={"ID":"423b3c4f-6603-41d8-aa68-98b535aac2f9","Type":"ContainerStarted","Data":"d3da816c23050f2e07d358cfb1f5a805d8fb0a824abe551bf70ab5c973a24cfb"} Apr 17 17:25:59.146416 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:59.146381 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-8z5d2" Apr 17 17:25:59.153373 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:25:59.153308 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-6bcc868b7-8z5d2" podStartSLOduration=1.7297681150000002 podStartE2EDuration="21.153292606s" podCreationTimestamp="2026-04-17 17:25:38 +0000 UTC" firstStartedPulling="2026-04-17 17:25:38.800322552 +0000 UTC m=+65.707055340" lastFinishedPulling="2026-04-17 17:25:58.223847037 +0000 UTC m=+85.130579831" observedRunningTime="2026-04-17 17:25:59.151165069 +0000 UTC m=+86.057897871" watchObservedRunningTime="2026-04-17 17:25:59.153292606 +0000 UTC m=+86.060025409" Apr 17 17:26:00.103364 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:00.103330 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-46l9f" Apr 17 17:26:01.030378 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:01.030348 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:26:01.115092 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:01.115056 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5fc859f89d-s6vt8"] Apr 17 17:26:01.140606 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:01.140555 2579 generic.go:358] "Generic (PLEG): container finished" podID="423b3c4f-6603-41d8-aa68-98b535aac2f9" containerID="8f5e2736d31743cc699f7d3d45e53b7ae47248275ea6d451011459d398667eef" exitCode=0 Apr 17 17:26:01.140758 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:01.140646 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jhj4" event={"ID":"423b3c4f-6603-41d8-aa68-98b535aac2f9","Type":"ContainerDied","Data":"8f5e2736d31743cc699f7d3d45e53b7ae47248275ea6d451011459d398667eef"} Apr 17 17:26:02.146687 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:02.146641 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jhj4" 
event={"ID":"423b3c4f-6603-41d8-aa68-98b535aac2f9","Type":"ContainerStarted","Data":"775929c0ff5ff3df4d7c01c25590c861fc6f0fff487bba8f9d8d42df2cf3f431"} Apr 17 17:26:02.146687 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:02.146690 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jhj4" event={"ID":"423b3c4f-6603-41d8-aa68-98b535aac2f9","Type":"ContainerStarted","Data":"aaeeec3de46e034ec23562e26f4d664060782b180b56dbdb8783a9a424e882a7"} Apr 17 17:26:21.207601 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:21.207563 2579 generic.go:358] "Generic (PLEG): container finished" podID="482108e9-1395-4eda-884c-859f77d7a6be" containerID="85b6f1155b75c44117b9d4cd7a561edeabd90da839b696ec82855c9fac1697f9" exitCode=0 Apr 17 17:26:21.208034 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:21.207643 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" event={"ID":"482108e9-1395-4eda-884c-859f77d7a6be","Type":"ContainerDied","Data":"85b6f1155b75c44117b9d4cd7a561edeabd90da839b696ec82855c9fac1697f9"} Apr 17 17:26:21.208034 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:21.207962 2579 scope.go:117] "RemoveContainer" containerID="85b6f1155b75c44117b9d4cd7a561edeabd90da839b696ec82855c9fac1697f9" Apr 17 17:26:21.227552 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:21.227506 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5jhj4" podStartSLOduration=23.845697617 podStartE2EDuration="25.227485596s" podCreationTimestamp="2026-04-17 17:25:56 +0000 UTC" firstStartedPulling="2026-04-17 17:25:58.870989957 +0000 UTC m=+85.777722737" lastFinishedPulling="2026-04-17 17:26:00.252777932 +0000 UTC m=+87.159510716" observedRunningTime="2026-04-17 17:26:02.167699608 +0000 UTC m=+89.074432409" watchObservedRunningTime="2026-04-17 17:26:21.227485596 +0000 UTC m=+108.134218398" Apr 
17 17:26:22.212163 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:22.212129 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-nnctm" event={"ID":"482108e9-1395-4eda-884c-859f77d7a6be","Type":"ContainerStarted","Data":"0c8e22e64f5f0c8ca3e89a43a38aa5d89be9859addb06646c8defca9b523cb24"} Apr 17 17:26:26.163418 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.163356 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" podUID="c0da0d0b-459b-4fcc-a426-b97d20867b60" containerName="registry" containerID="cri-o://44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01" gracePeriod=30 Apr 17 17:26:26.225724 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.225691 2579 generic.go:358] "Generic (PLEG): container finished" podID="94b536ad-a08a-4eea-b44e-9a2802212a72" containerID="500d9c48d56012ae15a71b3c73deeb819c641018fbe6cfdb2a4fe5c5b952b4b5" exitCode=0 Apr 17 17:26:26.225853 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.225752 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r2mbx" event={"ID":"94b536ad-a08a-4eea-b44e-9a2802212a72","Type":"ContainerDied","Data":"500d9c48d56012ae15a71b3c73deeb819c641018fbe6cfdb2a4fe5c5b952b4b5"} Apr 17 17:26:26.226081 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.226069 2579 scope.go:117] "RemoveContainer" containerID="500d9c48d56012ae15a71b3c73deeb819c641018fbe6cfdb2a4fe5c5b952b4b5" Apr 17 17:26:26.440463 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.440440 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:26:26.494222 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494196 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-trusted-ca\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494222 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494226 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-certificates\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494429 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494246 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-image-registry-private-configuration\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494429 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494264 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nnj6\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-kube-api-access-7nnj6\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494429 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494284 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-bound-sa-token\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: 
\"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494429 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494317 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-installation-pull-secrets\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494429 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494349 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494429 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494397 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0da0d0b-459b-4fcc-a426-b97d20867b60-ca-trust-extracted\") pod \"c0da0d0b-459b-4fcc-a426-b97d20867b60\" (UID: \"c0da0d0b-459b-4fcc-a426-b97d20867b60\") " Apr 17 17:26:26.494742 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.494689 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:26.495107 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.495082 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:26.497009 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.496976 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:26.497166 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.497104 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:26.497264 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.497232 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:26.497372 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.497355 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-kube-api-access-7nnj6" (OuterVolumeSpecName: "kube-api-access-7nnj6") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "kube-api-access-7nnj6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:26.497443 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.497422 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:26.503114 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.503092 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0da0d0b-459b-4fcc-a426-b97d20867b60-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c0da0d0b-459b-4fcc-a426-b97d20867b60" (UID: "c0da0d0b-459b-4fcc-a426-b97d20867b60"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:26:26.595406 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595372 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-bound-sa-token\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595406 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595402 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-installation-pull-secrets\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595406 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595414 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-tls\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595625 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595424 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c0da0d0b-459b-4fcc-a426-b97d20867b60-ca-trust-extracted\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595625 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595433 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-trusted-ca\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595625 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595443 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c0da0d0b-459b-4fcc-a426-b97d20867b60-registry-certificates\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595625 ip-10-0-130-17 
kubenswrapper[2579]: I0417 17:26:26.595452 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c0da0d0b-459b-4fcc-a426-b97d20867b60-image-registry-private-configuration\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:26.595625 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:26.595460 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7nnj6\" (UniqueName: \"kubernetes.io/projected/c0da0d0b-459b-4fcc-a426-b97d20867b60-kube-api-access-7nnj6\") on node \"ip-10-0-130-17.ec2.internal\" DevicePath \"\"" Apr 17 17:26:27.229968 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.229924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r2mbx" event={"ID":"94b536ad-a08a-4eea-b44e-9a2802212a72","Type":"ContainerStarted","Data":"62f24a7ef8b4ff815f2c780d23afa961b5dcaf57b5625e6a75d1590e99a33f6c"} Apr 17 17:26:27.231155 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.231132 2579 generic.go:358] "Generic (PLEG): container finished" podID="c0da0d0b-459b-4fcc-a426-b97d20867b60" containerID="44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01" exitCode=0 Apr 17 17:26:27.231257 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.231184 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" Apr 17 17:26:27.231257 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.231212 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" event={"ID":"c0da0d0b-459b-4fcc-a426-b97d20867b60","Type":"ContainerDied","Data":"44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01"} Apr 17 17:26:27.231257 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.231243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fc859f89d-s6vt8" event={"ID":"c0da0d0b-459b-4fcc-a426-b97d20867b60","Type":"ContainerDied","Data":"3db52795e4c7207e47be1f2ee78c128c2c9fa822c13299452cffe11afb24d8c7"} Apr 17 17:26:27.231257 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.231258 2579 scope.go:117] "RemoveContainer" containerID="44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01" Apr 17 17:26:27.239691 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.239675 2579 scope.go:117] "RemoveContainer" containerID="44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01" Apr 17 17:26:27.239933 ip-10-0-130-17 kubenswrapper[2579]: E0417 17:26:27.239911 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01\": container with ID starting with 44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01 not found: ID does not exist" containerID="44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01" Apr 17 17:26:27.239994 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.239941 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01"} err="failed to get container status 
\"44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01\": rpc error: code = NotFound desc = could not find container \"44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01\": container with ID starting with 44052820fb8090624b229257a12289dc635c2eca02ddc061112b435ed5bf3f01 not found: ID does not exist" Apr 17 17:26:27.263142 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.263111 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5fc859f89d-s6vt8"] Apr 17 17:26:27.269430 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.269402 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5fc859f89d-s6vt8"] Apr 17 17:26:27.643586 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:27.643550 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0da0d0b-459b-4fcc-a426-b97d20867b60" path="/var/lib/kubelet/pods/c0da0d0b-459b-4fcc-a426-b97d20867b60/volumes" Apr 17 17:26:46.289651 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:46.289611 2579 generic.go:358] "Generic (PLEG): container finished" podID="ed6fdd82-ec63-4507-83a7-188a60111e24" containerID="b4ae02ec51839858971be87c938dae4ec8d6c97f7b12de3d49bb357ab72874a1" exitCode=0 Apr 17 17:26:46.290050 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:46.289667 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" event={"ID":"ed6fdd82-ec63-4507-83a7-188a60111e24","Type":"ContainerDied","Data":"b4ae02ec51839858971be87c938dae4ec8d6c97f7b12de3d49bb357ab72874a1"} Apr 17 17:26:46.290050 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:46.290016 2579 scope.go:117] "RemoveContainer" containerID="b4ae02ec51839858971be87c938dae4ec8d6c97f7b12de3d49bb357ab72874a1" Apr 17 17:26:47.295209 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:26:47.295179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-87bk5" event={"ID":"ed6fdd82-ec63-4507-83a7-188a60111e24","Type":"ContainerStarted","Data":"7769c2c19eba9a27a9eee3051edb626d2211d82843d465e4a8c20620c7380283"} Apr 17 17:29:33.596347 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:29:33.596321 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:29:33.596911 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:29:33.596377 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:29:33.604097 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:29:33.604072 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:29:33.604657 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:29:33.604632 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:29:33.606889 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:29:33.606870 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:34:33.617396 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:34:33.617369 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:34:33.618648 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:34:33.618585 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:34:33.624668 ip-10-0-130-17 kubenswrapper[2579]: I0417 
17:34:33.624636 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:34:33.625989 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:34:33.625968 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:39:33.639631 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:39:33.639528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:39:33.643449 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:39:33.643423 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:39:33.647526 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:39:33.647504 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:39:33.650254 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:39:33.650069 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:44:33.660474 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:44:33.660441 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:44:33.665141 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:44:33.665117 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 
17 17:44:33.676571 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:44:33.676546 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:44:33.677174 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:44:33.677155 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:49:33.695488 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:49:33.695451 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:49:33.696074 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:49:33.696057 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:49:33.703064 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:49:33.703036 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:49:33.703369 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:49:33.703348 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:54:33.716522 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:54:33.716495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:54:33.718469 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:54:33.718437 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:54:33.723080 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:54:33.723059 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:54:33.724684 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:54:33.724668 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:59:33.736783 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:59:33.736750 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:59:33.741608 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:59:33.741565 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 17:59:33.745752 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:59:33.745729 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 17:59:33.748408 ip-10-0-130-17 kubenswrapper[2579]: I0417 17:59:33.748387 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:04:33.760141 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:04:33.760018 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:04:33.764015 ip-10-0-130-17 kubenswrapper[2579]: I0417 
18:04:33.763410 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:04:33.766761 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:04:33.766743 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:04:33.770367 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:04:33.770332 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:09:33.780625 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:09:33.780496 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:09:33.785646 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:09:33.785618 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:09:33.787873 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:09:33.787849 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:09:33.794725 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:09:33.794701 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:14:33.808122 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:14:33.807981 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 
17 18:14:33.812029 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:14:33.811909 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:14:33.815048 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:14:33.815029 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:14:33.818931 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:14:33.818911 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:19:33.828852 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:19:33.828743 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:19:33.832673 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:19:33.832650 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:19:33.835512 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:19:33.835493 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:19:33.838957 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:19:33.838941 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:24:33.849440 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:24:33.849329 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:24:33.853870 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:24:33.853847 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:24:33.856113 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:24:33.856087 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:24:33.860498 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:24:33.860479 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log" Apr 17 18:28:25.730923 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:25.730892 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5j2pv_f84e8ad7-d1f5-4adb-a051-4a0ab84d4e95/global-pull-secret-syncer/0.log" Apr 17 18:28:25.882880 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:25.882836 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bzxhn_a926d37e-a35b-4d6d-a341-5f224db6cd94/konnectivity-agent/0.log" Apr 17 18:28:25.989935 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:25.989848 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-17.ec2.internal_ee539be09c69658279b69c4b8f0acb61/haproxy/0.log" Apr 17 18:28:29.537554 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:29.537499 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-tzqfr_2e1343de-f52a-4262-8e36-2270dd39d6a2/cluster-monitoring-operator/0.log" Apr 17 18:28:29.723930 ip-10-0-130-17 
kubenswrapper[2579]: I0417 18:28:29.723840 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5jhj4_423b3c4f-6603-41d8-aa68-98b535aac2f9/node-exporter/0.log" Apr 17 18:28:29.745130 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:29.745100 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5jhj4_423b3c4f-6603-41d8-aa68-98b535aac2f9/kube-rbac-proxy/0.log" Apr 17 18:28:29.775929 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:29.775901 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5jhj4_423b3c4f-6603-41d8-aa68-98b535aac2f9/init-textfile/0.log" Apr 17 18:28:31.665673 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:31.665643 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fdkcf_b6dbc8c2-d7a9-453d-8ebc-1e0c119d98e5/networking-console-plugin/0.log" Apr 17 18:28:32.118144 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.118112 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/1.log" Apr 17 18:28:32.124926 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.124888 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-z445c_d7d8f932-f808-4d50-8b55-ad125b8b9a2c/console-operator/2.log" Apr 17 18:28:32.529569 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.529483 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8z5d2_fc5c9760-32a8-4cb9-9740-2192f5719cd7/download-server/0.log" Apr 17 18:28:32.816251 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.816171 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s"] Apr 17 18:28:32.816632 ip-10-0-130-17 
kubenswrapper[2579]: I0417 18:28:32.816498 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0da0d0b-459b-4fcc-a426-b97d20867b60" containerName="registry" Apr 17 18:28:32.816632 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.816511 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0da0d0b-459b-4fcc-a426-b97d20867b60" containerName="registry" Apr 17 18:28:32.816632 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.816579 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0da0d0b-459b-4fcc-a426-b97d20867b60" containerName="registry" Apr 17 18:28:32.819498 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.819481 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.821744 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.821724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2d5hj\"/\"default-dockercfg-gs5n6\"" Apr 17 18:28:32.821883 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.821759 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2d5hj\"/\"openshift-service-ca.crt\"" Apr 17 18:28:32.822422 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.822401 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2d5hj\"/\"kube-root-ca.crt\"" Apr 17 18:28:32.828626 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.828603 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s"] Apr 17 18:28:32.896182 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.896146 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-sys\") pod 
\"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.896360 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.896194 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-lib-modules\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.896360 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.896286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-podres\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.896360 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.896329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-proc\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.896464 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.896377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czz4q\" (UniqueName: \"kubernetes.io/projected/84ff610c-d7f8-4601-83b9-5e024ec1921d-kube-api-access-czz4q\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.963955 ip-10-0-130-17 kubenswrapper[2579]: I0417 
18:28:32.963915 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-rnfvq_ad57895c-7274-4a8b-a653-20f918afed96/volume-data-source-validator/0.log" Apr 17 18:28:32.996803 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-sys\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.996803 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-lib-modules\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997037 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-podres\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997037 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-sys\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997037 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996890 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-proc\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997037 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996937 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-proc\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997037 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-podres\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997037 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czz4q\" (UniqueName: \"kubernetes.io/projected/84ff610c-d7f8-4601-83b9-5e024ec1921d-kube-api-access-czz4q\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:32.997259 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:32.996965 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84ff610c-d7f8-4601-83b9-5e024ec1921d-lib-modules\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " 
pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:33.005232 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.005191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czz4q\" (UniqueName: \"kubernetes.io/projected/84ff610c-d7f8-4601-83b9-5e024ec1921d-kube-api-access-czz4q\") pod \"perf-node-gather-daemonset-gdq4s\" (UID: \"84ff610c-d7f8-4601-83b9-5e024ec1921d\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:33.130617 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.130562 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:33.257334 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.257299 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s"] Apr 17 18:28:33.260898 ip-10-0-130-17 kubenswrapper[2579]: W0417 18:28:33.260868 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod84ff610c_d7f8_4601_83b9_5e024ec1921d.slice/crio-43074c424f018bd5b6cb4fab3611f5af86b95116f507dea4e3ba14320156b7d8 WatchSource:0}: Error finding container 43074c424f018bd5b6cb4fab3611f5af86b95116f507dea4e3ba14320156b7d8: Status 404 returned error can't find the container with id 43074c424f018bd5b6cb4fab3611f5af86b95116f507dea4e3ba14320156b7d8 Apr 17 18:28:33.262571 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.262554 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:28:33.702085 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.702050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-46l9f_f6656a7d-18be-4793-8aae-ca80248fd4ac/dns/0.log" Apr 17 18:28:33.725896 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.725854 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-46l9f_f6656a7d-18be-4793-8aae-ca80248fd4ac/kube-rbac-proxy/0.log" Apr 17 18:28:33.824860 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.824831 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-szhhg_eb9e24d4-7146-488f-a450-9bd6feba5465/dns-node-resolver/0.log" Apr 17 18:28:33.926560 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.926525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" event={"ID":"84ff610c-d7f8-4601-83b9-5e024ec1921d","Type":"ContainerStarted","Data":"3b6cff43a7d9bf28ad4e1657e4be2f690fc6a656498394cfc191c212bac4fa45"} Apr 17 18:28:33.926742 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.926568 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" event={"ID":"84ff610c-d7f8-4601-83b9-5e024ec1921d","Type":"ContainerStarted","Data":"43074c424f018bd5b6cb4fab3611f5af86b95116f507dea4e3ba14320156b7d8"} Apr 17 18:28:33.926742 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.926678 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" Apr 17 18:28:33.943857 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:33.943797 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s" podStartSLOduration=1.9437797909999999 podStartE2EDuration="1.943779791s" podCreationTimestamp="2026-04-17 18:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:28:33.943614207 +0000 UTC m=+3840.850347001" watchObservedRunningTime="2026-04-17 18:28:33.943779791 +0000 UTC m=+3840.850512595" Apr 17 18:28:34.308873 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:34.308836 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9szwb_d18b6520-db25-43a0-bca5-6990fef41e34/node-ca/0.log"
Apr 17 18:28:35.057602 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.057415 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f49dd587d-rxz8k_757e5944-43d8-40d9-bf59-81391d9f77cf/router/0.log"
Apr 17 18:28:35.415561 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.415524 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8mf4x_6ba0a3f3-4f4d-4ea1-a514-e449db3682e3/serve-healthcheck-canary/0.log"
Apr 17 18:28:35.787625 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.787498 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r2mbx_94b536ad-a08a-4eea-b44e-9a2802212a72/insights-operator/0.log"
Apr 17 18:28:35.790129 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.790102 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r2mbx_94b536ad-a08a-4eea-b44e-9a2802212a72/insights-operator/1.log"
Apr 17 18:28:35.948718 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.948684 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xhfqz_8072177b-ab57-4392-ae96-968f804bc12a/kube-rbac-proxy/0.log"
Apr 17 18:28:35.970528 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.970500 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xhfqz_8072177b-ab57-4392-ae96-968f804bc12a/exporter/0.log"
Apr 17 18:28:35.992481 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:35.992449 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xhfqz_8072177b-ab57-4392-ae96-968f804bc12a/extractor/0.log"
Apr 17 18:28:39.939653 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:39.939627 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-gdq4s"
Apr 17 18:28:42.446886 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:42.446846 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w5fh4_76baba93-cb35-46ce-b2f4-05ea81e5ce12/migrator/0.log"
Apr 17 18:28:42.470579 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:42.470544 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w5fh4_76baba93-cb35-46ce-b2f4-05ea81e5ce12/graceful-termination/0.log"
Apr 17 18:28:42.868735 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:42.868693 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nnctm_482108e9-1395-4eda-884c-859f77d7a6be/kube-storage-version-migrator-operator/1.log"
Apr 17 18:28:42.870121 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:42.870095 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-nnctm_482108e9-1395-4eda-884c-859f77d7a6be/kube-storage-version-migrator-operator/0.log"
Apr 17 18:28:44.254069 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.254039 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/kube-multus-additional-cni-plugins/0.log"
Apr 17 18:28:44.274986 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.274949 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/egress-router-binary-copy/0.log"
Apr 17 18:28:44.295906 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.295877 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/cni-plugins/0.log"
Apr 17 18:28:44.316342 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.316315 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/bond-cni-plugin/0.log"
Apr 17 18:28:44.338236 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.338150 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/routeoverride-cni/0.log"
Apr 17 18:28:44.359180 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.359148 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/whereabouts-cni-bincopy/0.log"
Apr 17 18:28:44.380729 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.380699 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z8qx6_28463658-293e-4847-bb58-c40452c9ceba/whereabouts-cni/0.log"
Apr 17 18:28:44.416089 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.416045 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-njgk7_5f87164f-e1cb-4cad-aeef-d75c4e3648c7/kube-multus/0.log"
Apr 17 18:28:44.598918 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.598835 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z942n_d168f0a0-7fcd-4905-a424-24a94b7fcdbb/network-metrics-daemon/0.log"
Apr 17 18:28:44.617716 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:44.617682 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-z942n_d168f0a0-7fcd-4905-a424-24a94b7fcdbb/kube-rbac-proxy/0.log"
Apr 17 18:28:45.317314 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.317284 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-controller/0.log"
Apr 17 18:28:45.338334 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.338305 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/0.log"
Apr 17 18:28:45.371622 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.371565 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovn-acl-logging/1.log"
Apr 17 18:28:45.394373 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.394338 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/kube-rbac-proxy-node/0.log"
Apr 17 18:28:45.416272 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.416241 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 18:28:45.440445 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.440418 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/northd/0.log"
Apr 17 18:28:45.462347 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.462320 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/nbdb/0.log"
Apr 17 18:28:45.484562 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.484528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/sbdb/0.log"
Apr 17 18:28:45.650603 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:45.650559 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-h29v4_5ec8213b-d815-438e-ab3d-f610b8fc1f8a/ovnkube-controller/0.log"
Apr 17 18:28:47.419419 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:47.419383 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-n6tvz_0c9357c2-cf5b-4e52-889b-e7a839ac8e1d/check-endpoints/0.log"
Apr 17 18:28:47.468089 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:47.468043 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-m8x5g_0ace93ad-4902-4616-82aa-f2d931df41ef/network-check-target-container/0.log"
Apr 17 18:28:48.364112 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:48.364080 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-bbk45_8b5e6ca6-9cb4-41b2-9b42-cece2ca5ad9b/iptables-alerter/0.log"
Apr 17 18:28:49.068159 ip-10-0-130-17 kubenswrapper[2579]: I0417 18:28:49.068127 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jscv2_ced94cc1-575e-4efc-8406-8add5b3da29c/tuned/0.log"