Apr 17 11:13:34.433513 ip-10-0-129-94 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:34.433525 ip-10-0-129-94 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:34.433532 ip-10-0-129-94 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:34.433799 ip-10-0-129-94 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:13:44.673238 ip-10-0-129-94 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:13:44.673257 ip-10-0-129-94 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8864a49819de4812b63a9497ad826bda --
Apr 17 11:16:10.476687 ip-10-0-129-94 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:10.966380 ip-10-0-129-94 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:10.966380 ip-10-0-129-94 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:10.966380 ip-10-0-129-94 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:10.966380 ip-10-0-129-94 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:10.966380 ip-10-0-129-94 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:10.967518 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.967335 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:10.972090 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972073 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.972090 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972089 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972094 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972099 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972102 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972105 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972108 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972111 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972114 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972117 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972126 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972133 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972136 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972139 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972142 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972144 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972147 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972149 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972153 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972155 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972158 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.972159 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972161 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972164 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972166 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972169 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972172 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972175 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972177 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972180 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972183 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972185 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972188 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972192 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972194 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972197 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972200 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972202 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972205 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972207 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972210 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.972651 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972215 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972218 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972221 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972224 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972226 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972229 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972231 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972234 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972237 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972239 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972242 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972244 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972247 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972249 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972252 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972255 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972258 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972261 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972263 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972266 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.973103 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972269 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972271 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972274 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972277 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972279 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972282 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972285 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972288 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972290 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972293 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972295 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972298 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972300 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972303 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972305 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972308 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972311 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972313 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972316 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972318 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.973603 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972320 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972323 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972326 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972328 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972331 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972333 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
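
[Editor's note] The failed start at 11:13:34 at the top of this excerpt never reaches the kubelet binary: systemd cannot read an environment file referenced by kubelet.service, so the 'start-pre' task cannot run and the unit fails with result 'resources'; the queued restart then fails because the unit's transaction pulls in crio.service, which is not loaded on the node at that moment. A minimal sketch of unit directives of the shape that produces these messages (the drop-in name and paths are hypothetical; the node's real unit file may differ):

    # /etc/systemd/system/kubelet.service.d/10-env.conf -- illustrative only
    [Unit]
    # A dependency like this yields "Failed to schedule restart job:
    # Unit crio.service not found" whenever crio.service is absent:
    Requires=crio.service
    After=crio.service

    [Service]
    # Without the leading "-", a missing file fails the unit with
    # "Failed to load environment files: No such file or directory"
    # and result 'resources'; the "-" tells systemd to tolerate it:
    EnvironmentFile=-/etc/kubernetes/kubelet-env
    ExecStartPre=/bin/mkdir -p /var/lib/kubelet

Note that the second boot (11:16:10 onward, below) starts cleanly, so whatever provisioning step writes the environment file and installs crio.service had completed by then.
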
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972734 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972740 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972743 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972746 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972749 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972752 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972754 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972757 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972759 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972762 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972764 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972767 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972770 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972773 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.974073 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972775 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972778 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972780 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972783 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972785 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972788 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972790 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972792 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972795 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972798 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972800 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972802 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972805 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972807 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972810 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972812 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972815 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972817 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972820 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972824 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.974660 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972827 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972829 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972832 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972834 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972837 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972839 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972842 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972845 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972847 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972850 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972853 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972856 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972858 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972861 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972863 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972866 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972868 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972871 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972874 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972876 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.975154 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972879 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972881 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972884 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972886 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972889 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972891 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972893 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972896 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972899 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972901 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972904 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972908 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972911 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972913 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972916 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972918 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972921 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972923 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972927 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972931 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.975663 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972934 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972936 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972939 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972943 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972947 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972950 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972953 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972955 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972958 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972960 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972963 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.972965 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973039 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973046 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973052 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973057 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973061 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973065 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973069 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973073 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:10.976152 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973077 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973080 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973084 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973088 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973091 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973094 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973097 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973101 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973103 2571 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973106 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973109 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973114 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973117 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973120 2571 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973123 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973126 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973131 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973134 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973138 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973141 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973144 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973146 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973150 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973152 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973155 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:10.976653 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973160 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973163 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973166 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973169 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973171 2571 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973174 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973178 2571 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973182 2571 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973185 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973188 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973191 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973195 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973199 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973202 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973205 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973209 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973211 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973214 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973217 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973220 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973223 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973226 2571 flags.go:64] FLAG: --feature-gates=""
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973230 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973233 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973236 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 11:16:10.977286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973240 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973243 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973246 2571 flags.go:64] FLAG: --help="false"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973249 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-129-94.ec2.internal"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973252 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973255 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973258 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973262 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973265 2571 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973268 2571 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973271 2571 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973274 2571 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973277 2571 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973280 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973283 2571 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973286 2571 flags.go:64] FLAG: --kube-reserved=""
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973289 2571 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973292 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973295 2571 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973297 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973301 2571 flags.go:64] FLAG: --lock-file=""
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973304 2571 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973307 2571 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973310 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 11:16:10.977896 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973315 2571 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973317 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973320 2571 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973323 2571 flags.go:64] FLAG: --logging-format="text"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973326 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973329 2571 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973332 2571 flags.go:64] FLAG: --manifest-url=""
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973335 2571 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973340 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973343 2571 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973347 2571 flags.go:64] FLAG: --max-pods="110"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973350 2571 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973353 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973357 2571 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973360 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973362 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973379 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973382 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973390 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973393 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973396 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973399 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973402 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 11:16:10.978530 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973408 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973411 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973414 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973417 2571 flags.go:64] FLAG: --port="10250"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973421 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973423 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07cfdd99d859d2590"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973427 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973429 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973433 2571 flags.go:64] FLAG: --register-node="true"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973435 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973438 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973442 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973445 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973448 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973450 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973454 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973457 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973460 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973464 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973467 2571 flags.go:64] FLAG: --runonce="false"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973470 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973473 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973476 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973479 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973482 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973485 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 11:16:10.979091 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973489 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973492 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973495 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973498 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973501 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973503 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973507 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973510 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973514 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973519 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973522 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973525 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973530 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973532 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973535 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973538 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973541 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973544 2571 flags.go:64] FLAG: --v="2"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973548 2571 flags.go:64] FLAG: --version="false"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973552 2571 flags.go:64] FLAG: --vmodule=""
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973556 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.973559 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973645 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973649 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.979731 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973655 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973658 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973662 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973665 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973668 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973671 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973673 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973676 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973678 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973681 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973684 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973687 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973689 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973692 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973695 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973697 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973700 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973703 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973706 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973708 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.980352 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973711 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973714 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973716 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973719 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973721 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973724 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973727 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973729 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973732 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973734 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973737 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973739 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973744 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973746 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973750 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973752 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973755 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973757 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973760 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973763 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.980914 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973765 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973768 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973771 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973773 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973776 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973781 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973784 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973787 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
kubenswrapper[2571]: W0417 11:16:10.973790 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973792 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973795 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973798 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973801 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973803 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973806 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973808 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973811 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973813 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973816 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973818 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 11:16:10.981690 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973821 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973823 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973826 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973828 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973832 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973835 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973837 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973842 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973845 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973848 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973851 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973854 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973857 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973859 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973862 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973865 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973867 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973871 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973873 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973876 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.982315 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973878 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973881 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973884 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.973886 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.974844 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.982569 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.982590 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982646 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982651 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982655 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982658 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982661 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982663 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982668 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982673 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.982847 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982678 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982681 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982685 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982688 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982690 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982693 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982696 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982698 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982701 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982704 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982706 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982709 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982712 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982715 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982717 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982721 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982723 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982726 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982729 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982732 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.983235 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982734 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982737 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982739 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982742 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982745 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982748 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982751 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982753 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982756 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982758 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982761 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982764 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982767 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982769 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982772 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982775 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982777 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982780 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982782 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.983739 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982785 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982787 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982790 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982792 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982795 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982797 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982800 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982803 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982805 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982808 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982810 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982813 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982815 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982818 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982821 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982823 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982827 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982830 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982832 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982835 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.984247 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982838 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982840 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982843 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982845 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982848 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982851 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982853 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982856 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982859 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982862 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982865 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982868 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982870 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982873 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982875 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982878 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982880 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982883 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.984788 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.982885 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.982891 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983006 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983012 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983015 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983019 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983022 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983025 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983028 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983030 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983034 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983037 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983040 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983042 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983045 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983048 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:10.985227 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983050 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983053 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983055 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983058 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983061 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983063 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983066 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983068 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983071 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983074 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983076 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983079 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983081 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983084 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983086 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983089 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983091 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983094 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983096 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983099 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:10.985636 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983102 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983104 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983107 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983110 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983112 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983115 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983117 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983120 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983123 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983125 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983129 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983131 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983134 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983136 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983139 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983141 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983144 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983146 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983149 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:10.986114 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983151 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983154 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983158 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983161 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983163 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983165 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983168 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983170 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983173 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983175 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983178 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983180 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983183 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983187 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983190 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983193 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983196 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983200 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983203 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:10.986593 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983206 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983209 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983212 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983214 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983217 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983219 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983222 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983224 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983227 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983229 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983232 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983234 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983237 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:10.983240 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.983244 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.983381 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:10.987044 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.986924 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:10.987859 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.987846 2571 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:10.987955 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.987940 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:10.987995 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:10.987978 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:11.013818 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.013796 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:11.016548 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.016531 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:11.032938 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.032919 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:11.041229 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.041211 2571 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:11.042489 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.042471 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:11.045760 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.045741 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:11.046992 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.046973 2571 fs.go:135] Filesystem UUIDs: map[16a44681-e5ab-4625-803a-ce24268cb119:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b01d4060-665e-4c03-ad99-93f040c405d3:/dev/nvme0n1p4]
Apr 17 11:16:11.047039 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.046993 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:11.053853 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.053740 2571 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:11.05168506 +0000 UTC m=+0.447608154 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097346 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec289a882625e600568c48d51829b199 SystemUUID:ec289a88-2625-e600-568c-48d51829b199 BootID:8864a498-19de-4812-b63a-9497ad826bda Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:09:68:0d:81:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:09:68:0d:81:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:a9:e2:76:66:98 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:11.053853 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.053842 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:11.053983 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.053926 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 11:16:11.055109 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.055086 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 11:16:11.055280 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.055112 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-94.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 11:16:11.055331 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.055290 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 11:16:11.055331 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.055299 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 11:16:11.055331 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.055312 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:11.056681 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.056670 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 11:16:11.058193 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.058183 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 11:16:11.058498 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.058488 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 11:16:11.061054 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.061045 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 11:16:11.061086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.061058 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 11:16:11.061086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.061070 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 11:16:11.061086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.061079 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 17 11:16:11.061178 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.061089 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 11:16:11.062221 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.062210 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:11.062269 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.062230 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 11:16:11.064537 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.064477 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tkv99"
Apr 17 11:16:11.065542 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.065518 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 11:16:11.066926 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.066912 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 11:16:11.069133 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069117 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 11:16:11.069133 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069135 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069141 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069148 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069158 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069167 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069176 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069181 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069189 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069195 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069213 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 11:16:11.069231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.069221 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 11:16:11.070113 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.070104 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 11:16:11.070113 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.070113 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 11:16:11.071155 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.071135 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-tkv99"
Apr 17 11:16:11.073108 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.073079 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-94.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 11:16:11.073194 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.073081 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 11:16:11.073194 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.073138 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-94.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 11:16:11.073757 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.073743 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 11:16:11.073796 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.073782 2571 server.go:1295] "Started kubelet"
Apr 17 11:16:11.073872 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.073850 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 11:16:11.073918 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.073859 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 11:16:11.073961 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.073919 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 11:16:11.074714 ip-10-0-129-94 systemd[1]: Started Kubernetes Kubelet.
Apr 17 11:16:11.075825 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.075806 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:11.076786 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.076771 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:11.080707 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.080689 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:11.081303 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.081290 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 11:16:11.082005 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.081989 2571 factory.go:55] Registering systemd factory Apr 17 11:16:11.082005 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082007 2571 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:11.082166 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082128 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:11.082219 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082177 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:11.082219 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082194 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:11.082219 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082201 2571 factory.go:153] Registering CRI-O factory Apr 17 11:16:11.082219 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082212 2571 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:11.082452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082260 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:11.082452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082283 2571 factory.go:103] Registering Raw factory Apr 17 11:16:11.082452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082300 2571 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:11.082452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082342 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:11.082452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082351 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:11.082452 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.082354 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.082725 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.082707 2571 manager.go:319] Starting recovery of all containers Apr 17 11:16:11.084161 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.084135 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:11.084360 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.084329 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:11.091210 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.091050 2571 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 11:16:11.094192 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.094172 2571 manager.go:324] Recovery completed Apr 17 11:16:11.095388 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.095352 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-94.ec2.internal\" not found" node="ip-10-0-129-94.ec2.internal" Apr 17 11:16:11.098976 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.098964 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.101452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.101437 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.101522 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.101466 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.101522 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.101480 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.101960 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.101947 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:11.102015 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.101959 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:11.102015 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.101978 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:11.104415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.104381 2571 policy_none.go:49] "None policy: Start" Apr 17 11:16:11.104415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.104401 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:11.104415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.104414 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:11.137927 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.137911 2571 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.137943 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.137952 2571 server.go:85] "Starting device plugin registration server" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.138146 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.138156 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.138228 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.138296 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.138304 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.138903 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate 
container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 11:16:11.151388 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.138940 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.211452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.211420 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 11:16:11.211452 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.211453 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:11.211619 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.211471 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:11.211619 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.211478 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:11.211619 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.211508 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:11.214402 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.214384 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:11.238703 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.238659 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.239644 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.239629 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.239699 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.239654 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.239699 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.239663 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.239699 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.239691 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-94.ec2.internal" Apr 17 11:16:11.248717 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.248702 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-94.ec2.internal" Apr 17 11:16:11.248806 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.248722 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-94.ec2.internal\": node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.267321 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.267300 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.311834 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.311786 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal"] Apr 17 11:16:11.311919 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.311875 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.312808 ip-10-0-129-94 kubenswrapper[2571]: 
Apr 17 11:16:11.312857 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.312821 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:11.312857 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.312830 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:11.314221 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314208 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:11.314382 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314359 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.314427 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314398 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:11.314910 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314893 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:11.314964 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314925 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:11.314964 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314935 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:11.315031 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314893 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:11.315031 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.314996 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:11.315031 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.315008 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:11.316237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.316222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.316283 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.316247 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:11.316856 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.316841 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:11.316912 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.316866 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:11.316912 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.316880 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:11.341019 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.340998 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-94.ec2.internal\" not found" node="ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.344380 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.344353 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-94.ec2.internal\" not found" node="ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.367427 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.367401 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found"
Apr 17 11:16:11.383893 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.383876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d762e99710cc8c4964384ecaf1747fc1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal\" (UID: \"d762e99710cc8c4964384ecaf1747fc1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.383984 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.383900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d762e99710cc8c4964384ecaf1747fc1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal\" (UID: \"d762e99710cc8c4964384ecaf1747fc1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.383984 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.383918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69d674d86c0903ec8afec4ccfab0ec85-config\") pod \"kube-apiserver-proxy-ip-10-0-129-94.ec2.internal\" (UID: \"69d674d86c0903ec8afec4ccfab0ec85\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.468481 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.468453 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found"
Apr 17 11:16:11.484438 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.484416 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d762e99710cc8c4964384ecaf1747fc1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal\" (UID: \"d762e99710cc8c4964384ecaf1747fc1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.484504 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.484452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d762e99710cc8c4964384ecaf1747fc1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal\" (UID: \"d762e99710cc8c4964384ecaf1747fc1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.484504 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.484478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69d674d86c0903ec8afec4ccfab0ec85-config\") pod \"kube-apiserver-proxy-ip-10-0-129-94.ec2.internal\" (UID: \"69d674d86c0903ec8afec4ccfab0ec85\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.484578 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.484503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69d674d86c0903ec8afec4ccfab0ec85-config\") pod \"kube-apiserver-proxy-ip-10-0-129-94.ec2.internal\" (UID: \"69d674d86c0903ec8afec4ccfab0ec85\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.484578 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.484511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d762e99710cc8c4964384ecaf1747fc1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal\" (UID: \"d762e99710cc8c4964384ecaf1747fc1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.484578 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.484512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d762e99710cc8c4964384ecaf1747fc1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal\" (UID: \"d762e99710cc8c4964384ecaf1747fc1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.569634 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.569555 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found"
Apr 17 11:16:11.643110 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.643087 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal"
Apr 17 11:16:11.646785 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.646757 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal"
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal" Apr 17 11:16:11.670423 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.670388 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.770917 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.770882 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.871488 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.871410 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.971931 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:11.971898 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:11.988324 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.988305 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 11:16:11.988454 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.988438 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:11.988514 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:11.988484 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 11:16:12.072890 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:12.072858 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:12.072890 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.072844 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:11 +0000 UTC" deadline="2027-09-22 10:48:58.643714279 +0000 UTC" Apr 17 11:16:12.073077 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.072901 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12551h32m46.570818468s" Apr 17 11:16:12.081590 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.081564 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:12.105154 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.105128 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:12.139171 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:12.139143 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd762e99710cc8c4964384ecaf1747fc1.slice/crio-3f1622784adf195304df6535c79411d4fdc1967f05fb5624f343a265eb23a354 WatchSource:0}: Error finding container 3f1622784adf195304df6535c79411d4fdc1967f05fb5624f343a265eb23a354: Status 404 returned error can't find the container with id 
3f1622784adf195304df6535c79411d4fdc1967f05fb5624f343a265eb23a354 Apr 17 11:16:12.139557 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:12.139533 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69d674d86c0903ec8afec4ccfab0ec85.slice/crio-cf6393eed9f3ee93d9822be4521356ebc23eb05950743577b31a7d92fc896753 WatchSource:0}: Error finding container cf6393eed9f3ee93d9822be4521356ebc23eb05950743577b31a7d92fc896753: Status 404 returned error can't find the container with id cf6393eed9f3ee93d9822be4521356ebc23eb05950743577b31a7d92fc896753 Apr 17 11:16:12.144964 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.144949 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:16:12.160168 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.160148 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-thx5b" Apr 17 11:16:12.173520 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:12.173499 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:12.185268 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.185249 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-thx5b" Apr 17 11:16:12.215180 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.215132 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal" event={"ID":"d762e99710cc8c4964384ecaf1747fc1","Type":"ContainerStarted","Data":"3f1622784adf195304df6535c79411d4fdc1967f05fb5624f343a265eb23a354"} Apr 17 11:16:12.216125 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.216103 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal" event={"ID":"69d674d86c0903ec8afec4ccfab0ec85","Type":"ContainerStarted","Data":"cf6393eed9f3ee93d9822be4521356ebc23eb05950743577b31a7d92fc896753"} Apr 17 11:16:12.274305 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:12.274272 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-94.ec2.internal\" not found" Apr 17 11:16:12.364087 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.364066 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:12.382483 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.382423 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal" Apr 17 11:16:12.392132 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.392106 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:12.394050 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.394033 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal" Apr 17 11:16:12.403945 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:12.403921 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 11:16:12.537692 ip-10-0-129-94 
kubenswrapper[2571]: I0417 11:16:12.537636 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:13.062346 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.062307 2571 apiserver.go:52] "Watching apiserver" Apr 17 11:16:13.069269 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.069248 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 11:16:13.069658 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.069632 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-7xnhl","kube-system/global-pull-secret-syncer-27glw","kube-system/konnectivity-agent-29v4q","openshift-image-registry/node-ca-fgzfd","openshift-multus/network-metrics-daemon-zsnbl","openshift-network-operator/iptables-alerter-jc7f5","openshift-ovn-kubernetes/ovnkube-node-wjrrc","kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6","openshift-cluster-node-tuning-operator/tuned-prk6q","openshift-dns/node-resolver-rctf4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal","openshift-multus/multus-65tnl","openshift-multus/multus-additional-cni-plugins-v728t"] Apr 17 11:16:13.071853 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.071828 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.073035 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.073010 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.074232 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074165 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 11:16:13.074232 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074256 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 11:16:13.074502 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074482 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tn8wx\"" Apr 17 11:16:13.074580 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074553 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 11:16:13.074580 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074560 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 11:16:13.074705 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074681 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.074816 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074798 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.074935 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.074915 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.075262 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.075151 2571 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 11:16:13.075262 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.075196 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.075409 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.075295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 11:16:13.075409 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.075343 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f2dq9\"" Apr 17 11:16:13.075755 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.075736 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:13.075834 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.075792 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.077632 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.077112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:13.077747 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.077722 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:13.077917 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.077890 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 11:16:13.078713 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.078282 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.078713 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.078513 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vzb99\"" Apr 17 11:16:13.078859 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.078734 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xdwq8\"" Apr 17 11:16:13.078859 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.078772 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 11:16:13.079692 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.078389 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.079692 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.079251 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.080621 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.080600 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 11:16:13.081235 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.081181 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.081329 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.081293 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-cq9wg\"" Apr 17 11:16:13.081465 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.081407 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.082170 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.082143 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 11:16:13.082285 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.082248 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.083008 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.082846 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9md9h\"" Apr 17 11:16:13.083114 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.083040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:13.083114 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.083066 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.083114 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.083091 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:13.083335 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.083312 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 11:16:13.083757 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.083740 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.084438 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.084420 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.085816 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.085795 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.086071 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.086022 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 11:16:13.086150 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.086099 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 11:16:13.086463 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.086441 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jm5kp\"" Apr 17 11:16:13.087572 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.087551 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.087671 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.087619 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.087671 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.087638 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5fpwr\"" Apr 17 11:16:13.087921 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.087902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.089315 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.089296 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.089454 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.089435 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:13.089527 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.089404 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:13.089622 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.089602 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:13.089750 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.089732 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-csxvt\"" Apr 17 11:16:13.093122 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093098 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-ovn\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.093211 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgwtr\" (UniqueName: \"kubernetes.io/projected/4c820c8f-2002-4e3b-afd9-88115414ecc4-kube-api-access-zgwtr\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.093211 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-etc-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.093211 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-daemon-config\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.093401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnscc\" (UniqueName: \"kubernetes.io/projected/7c6d0851-5688-40f9-8967-116e7a6bddf3-kube-api-access-jnscc\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.093401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-run-netns\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.093401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-etc-kubernetes\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.093401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093297 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fceaee-2358-4389-a551-6c489878daca-host\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.093401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-systemd-units\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.093401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-log-socket\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093426 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-cni-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093495 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02fceaee-2358-4389-a551-6c489878daca-serviceca\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093546 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-systemd\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-system-cni-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-os-release\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-device-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-sys-fs\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093701 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srn5t\" (UniqueName: \"kubernetes.io/projected/02fceaee-2358-4389-a551-6c489878daca-kube-api-access-srn5t\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.093733 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093726 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-cni-bin\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c820c8f-2002-4e3b-afd9-88115414ecc4-iptables-alerter-script\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093781 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c820c8f-2002-4e3b-afd9-88115414ecc4-host-slash\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093804 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-etc-selinux\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 
Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-k8s-cni-cncf-io\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-hostroot\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-node-log\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.093997 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094086 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7c6d0851-5688-40f9-8967-116e7a6bddf3-cni-binary-copy\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094101 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-kubelet\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-multus-certs\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094149 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a6c5567f-d00d-4e77-b239-f0ad9016d0b1-agent-certs\") pod \"konnectivity-agent-29v4q\" (UID: \"a6c5567f-d00d-4e77-b239-f0ad9016d0b1\") " pod="kube-system/konnectivity-agent-29v4q"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094173 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgh7h\" (UniqueName: \"kubernetes.io/projected/44159d9f-1705-4830-8bfe-c087640f29cb-kube-api-access-dgh7h\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-cni-netd\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc9qv\" (UniqueName: \"kubernetes.io/projected/b4022fde-6cb7-4448-ba75-34477921e084-kube-api-access-wc9qv\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094282 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-cnibin\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094301 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-netns\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-cni-bin\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-slash\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-env-overrides\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094479 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-conf-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094510 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a6c5567f-d00d-4e77-b239-f0ad9016d0b1-konnectivity-ca\") pod \"konnectivity-agent-29v4q\" (UID: \"a6c5567f-d00d-4e77-b239-f0ad9016d0b1\") " pod="kube-system/konnectivity-agent-29v4q"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-socket-dir-parent\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-cni-multus\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.094589 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-kubelet\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094617 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-ovnkube-script-lib\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094657 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-registration-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6"
Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4022fde-6cb7-4448-ba75-34477921e084-ovn-node-metrics-cert\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4022fde-6cb7-4448-ba75-34477921e084-ovn-node-metrics-cert\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-socket-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cq57\" (UniqueName: \"kubernetes.io/projected/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-kube-api-access-7cq57\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094760 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-var-lib-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.095225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.094807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-ovnkube-config\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.183006 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.182974 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:13.186017 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.185983 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:12 +0000 UTC" deadline="2028-01-01 14:22:39.043875876 +0000 UTC" Apr 17 11:16:13.186017 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.186014 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14979h6m25.857864214s" Apr 17 11:16:13.195536 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-ovnkube-config\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.195536 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgtk\" (UniqueName: \"kubernetes.io/projected/be735422-56b8-4ef0-8974-325284a7057a-kube-api-access-8mgtk\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.195706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-ovn\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.195706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgwtr\" (UniqueName: \"kubernetes.io/projected/4c820c8f-2002-4e3b-afd9-88115414ecc4-kube-api-access-zgwtr\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.195706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-cnibin\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.195848 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-ovn\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.195848 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-etc-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.195848 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-daemon-config\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.195848 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnscc\" (UniqueName: \"kubernetes.io/projected/7c6d0851-5688-40f9-8967-116e7a6bddf3-kube-api-access-jnscc\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/05781f98-6d49-4771-a747-d678a55de76e-hosts-file\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195859 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-etc-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-run-netns\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195906 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-etc-kubernetes\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fceaee-2358-4389-a551-6c489878daca-host\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195972 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-run-netns\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.195990 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-os-release\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.196038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-etc-kubernetes\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-cni-binary-copy\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196095 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02fceaee-2358-4389-a551-6c489878daca-host\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysconfig\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-systemd-units\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-log-socket\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-ovnkube-config\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-cni-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02fceaee-2358-4389-a551-6c489878daca-serviceca\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196276 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-systemd-units\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196288 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-var-lib-kubelet\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-log-socket\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-systemd\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196347 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-cni-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196396 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-systemd\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196406 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-system-cni-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-os-release\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.196490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-system-cni-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196395 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-daemon-config\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-device-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196499 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-sys-fs\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srn5t\" (UniqueName: \"kubernetes.io/projected/02fceaee-2358-4389-a551-6c489878daca-kube-api-access-srn5t\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196547 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-os-release\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-system-cni-dir\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196500 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-device-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysctl-d\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196578 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-sys-fs\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-cni-bin\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-host\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbt7b\" (UniqueName: \"kubernetes.io/projected/05781f98-6d49-4771-a747-d678a55de76e-kube-api-access-cbt7b\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4"
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-cni-bin\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c820c8f-2002-4e3b-afd9-88115414ecc4-iptables-alerter-script\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5"
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c820c8f-2002-4e3b-afd9-88115414ecc4-host-slash\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5"
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-etc-selinux\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6"
Apr 17 11:16:13.197281 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysctl-conf\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196775 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c820c8f-2002-4e3b-afd9-88115414ecc4-host-slash\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196787 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02fceaee-2358-4389-a551-6c489878daca-serviceca\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-etc-selinux\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-k8s-cni-cncf-io\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-hostroot\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.196999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-hostroot\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-k8s-cni-cncf-io\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197018 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-run-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197034 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wmr2\" (UniqueName: \"kubernetes.io/projected/31b7facb-4c12-4174-a583-430fbb53bf63-kube-api-access-4wmr2\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-modprobe-d\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-systemd\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05781f98-6d49-4771-a747-d678a55de76e-tmp-dir\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-node-log\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.197875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c820c8f-2002-4e3b-afd9-88115414ecc4-iptables-alerter-script\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197211 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-node-log\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197243 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-lib-modules\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197261 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197269 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cf904787-1ca2-44e2-a227-75aa1d60f7a0-kubelet-config\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cf904787-1ca2-44e2-a227-75aa1d60f7a0-dbus\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7c6d0851-5688-40f9-8967-116e7a6bddf3-cni-binary-copy\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197445 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-kubelet\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-multus-certs\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-run-ovn-kubernetes\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197498 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-kubelet\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-multus-certs\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197502 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a6c5567f-d00d-4e77-b239-f0ad9016d0b1-agent-certs\") pod \"konnectivity-agent-29v4q\" (UID: \"a6c5567f-d00d-4e77-b239-f0ad9016d0b1\") " pod="kube-system/konnectivity-agent-29v4q"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197538 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t"
Apr 17 11:16:13.198591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-sys\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgh7h\" (UniqueName: \"kubernetes.io/projected/44159d9f-1705-4830-8bfe-c087640f29cb-kube-api-access-dgh7h\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-cni-netd\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc9qv\" (UniqueName: \"kubernetes.io/projected/b4022fde-6cb7-4448-ba75-34477921e084-kube-api-access-wc9qv\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197685 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-cnibin\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-netns\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197738 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-cni-netd\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197754 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-cni-bin\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197809 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-cnibin\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197853 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-run-netns\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197882 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-tmp\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7c6d0851-5688-40f9-8967-116e7a6bddf3-cni-binary-copy\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-slash\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-cni-bin\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-slash\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.197959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-env-overrides\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:16:13.199313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-conf-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl"
Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a6c5567f-d00d-4e77-b239-f0ad9016d0b1-konnectivity-ca\") pod \"konnectivity-agent-29v4q\" (UID: \"a6c5567f-d00d-4e77-b239-f0ad9016d0b1\") " pod="kube-system/konnectivity-agent-29v4q"
Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-kubernetes\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-etc-tuned\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q"
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-etc-tuned\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-socket-dir-parent\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-socket-dir-parent\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198247 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-cni-multus\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198276 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-run\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-kubelet\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198323 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-multus-conf-dir\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7c6d0851-5688-40f9-8967-116e7a6bddf3-host-var-lib-cni-multus\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-host-kubelet\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a6c5567f-d00d-4e77-b239-f0ad9016d0b1-konnectivity-ca\") pod \"konnectivity-agent-29v4q\" (UID: \"a6c5567f-d00d-4e77-b239-f0ad9016d0b1\") " pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-ovnkube-script-lib\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-registration-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4022fde-6cb7-4448-ba75-34477921e084-ovn-node-metrics-cert\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.199963 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-socket-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cq57\" 
(UniqueName: \"kubernetes.io/projected/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-kube-api-access-7cq57\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-registration-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-ovnkube-script-lib\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-var-lib-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.198837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b4022fde-6cb7-4448-ba75-34477921e084-var-lib-openvswitch\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.198924 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.199008 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.698977856 +0000 UTC m=+3.094900958 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.199010 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b4022fde-6cb7-4448-ba75-34477921e084-env-overrides\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.200500 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.199066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-socket-dir\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.201654 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.201631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b4022fde-6cb7-4448-ba75-34477921e084-ovn-node-metrics-cert\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.201849 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.201831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a6c5567f-d00d-4e77-b239-f0ad9016d0b1-agent-certs\") pod \"konnectivity-agent-29v4q\" (UID: \"a6c5567f-d00d-4e77-b239-f0ad9016d0b1\") " pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:13.203852 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.203828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgwtr\" (UniqueName: \"kubernetes.io/projected/4c820c8f-2002-4e3b-afd9-88115414ecc4-kube-api-access-zgwtr\") pod \"iptables-alerter-jc7f5\" (UID: \"4c820c8f-2002-4e3b-afd9-88115414ecc4\") " pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.208566 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.208543 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgh7h\" (UniqueName: \"kubernetes.io/projected/44159d9f-1705-4830-8bfe-c087640f29cb-kube-api-access-dgh7h\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:13.208716 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.208668 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srn5t\" (UniqueName: \"kubernetes.io/projected/02fceaee-2358-4389-a551-6c489878daca-kube-api-access-srn5t\") pod \"node-ca-fgzfd\" (UID: \"02fceaee-2358-4389-a551-6c489878daca\") " pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.209099 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.209070 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:13.209099 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.209098 2571 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:13.209230 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.209113 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mt448 for pod openshift-network-diagnostics/network-check-target-7xnhl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.209230 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.209186 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448 podName:27626c71-9dab-4636-93f8-f3321c44e711 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.70916591 +0000 UTC m=+3.105089009 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mt448" (UniqueName: "kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448") pod "network-check-target-7xnhl" (UID: "27626c71-9dab-4636-93f8-f3321c44e711") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.211180 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.211156 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc9qv\" (UniqueName: \"kubernetes.io/projected/b4022fde-6cb7-4448-ba75-34477921e084-kube-api-access-wc9qv\") pod \"ovnkube-node-wjrrc\" (UID: \"b4022fde-6cb7-4448-ba75-34477921e084\") " pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.211628 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.211588 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cq57\" (UniqueName: \"kubernetes.io/projected/3dc3cf5b-5d2a-40f0-a694-f05e7539986c-kube-api-access-7cq57\") pod \"aws-ebs-csi-driver-node-m6zb6\" (UID: \"3dc3cf5b-5d2a-40f0-a694-f05e7539986c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.211765 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.211749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnscc\" (UniqueName: \"kubernetes.io/projected/7c6d0851-5688-40f9-8967-116e7a6bddf3-kube-api-access-jnscc\") pod \"multus-65tnl\" (UID: \"7c6d0851-5688-40f9-8967-116e7a6bddf3\") " pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.287850 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.287809 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:13.299497 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299470 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05781f98-6d49-4771-a747-d678a55de76e-tmp-dir\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.299497 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " 
pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.299728 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299536 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-lib-modules\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.299728 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cf904787-1ca2-44e2-a227-75aa1d60f7a0-kubelet-config\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.299728 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cf904787-1ca2-44e2-a227-75aa1d60f7a0-dbus\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.299728 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.299728 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.299728 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.299720 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-lib-modules\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299775 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cf904787-1ca2-44e2-a227-75aa1d60f7a0-dbus\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.299788 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret podName:cf904787-1ca2-44e2-a227-75aa1d60f7a0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.799768136 +0000 UTC m=+3.195691237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret") pod "global-pull-secret-syncer-27glw" (UID: "cf904787-1ca2-44e2-a227-75aa1d60f7a0") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299794 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cf904787-1ca2-44e2-a227-75aa1d60f7a0-kubelet-config\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-sys\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-tuning-conf-dir\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-tmp\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-kubernetes\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.299998 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.299968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-sys\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300026 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-kubernetes\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-etc-tuned\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300089 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-run\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300134 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgtk\" (UniqueName: \"kubernetes.io/projected/be735422-56b8-4ef0-8974-325284a7057a-kube-api-access-8mgtk\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-cnibin\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/05781f98-6d49-4771-a747-d678a55de76e-hosts-file\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300219 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-os-release\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300225 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-run\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-cni-binary-copy\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysconfig\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-var-lib-kubelet\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/05781f98-6d49-4771-a747-d678a55de76e-hosts-file\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-system-cni-dir\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/05781f98-6d49-4771-a747-d678a55de76e-tmp-dir\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.300458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300443 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-os-release\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-system-cni-dir\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31b7facb-4c12-4174-a583-430fbb53bf63-cnibin\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300494 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysconfig\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysctl-d\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-host\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-var-lib-kubelet\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbt7b\" (UniqueName: \"kubernetes.io/projected/05781f98-6d49-4771-a747-d678a55de76e-kube-api-access-cbt7b\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300581 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysctl-conf\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-host\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wmr2\" (UniqueName: \"kubernetes.io/projected/31b7facb-4c12-4174-a583-430fbb53bf63-kube-api-access-4wmr2\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300648 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-modprobe-d\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysctl-d\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300674 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-systemd\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300703 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-sysctl-conf\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300745 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-systemd\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/be735422-56b8-4ef0-8974-325284a7057a-etc-modprobe-d\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.301415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.300831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-cni-binary-copy\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.302057 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.301266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31b7facb-4c12-4174-a583-430fbb53bf63-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.302492 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.302472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-etc-tuned\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.302700 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.302683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/be735422-56b8-4ef0-8974-325284a7057a-tmp\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.308641 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.308612 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8mgtk\" (UniqueName: \"kubernetes.io/projected/be735422-56b8-4ef0-8974-325284a7057a-kube-api-access-8mgtk\") pod \"tuned-prk6q\" (UID: \"be735422-56b8-4ef0-8974-325284a7057a\") " pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.309557 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.309536 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbt7b\" (UniqueName: \"kubernetes.io/projected/05781f98-6d49-4771-a747-d678a55de76e-kube-api-access-cbt7b\") pod \"node-resolver-rctf4\" (UID: \"05781f98-6d49-4771-a747-d678a55de76e\") " pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.309786 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.309767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wmr2\" (UniqueName: \"kubernetes.io/projected/31b7facb-4c12-4174-a583-430fbb53bf63-kube-api-access-4wmr2\") pod \"multus-additional-cni-plugins-v728t\" (UID: \"31b7facb-4c12-4174-a583-430fbb53bf63\") " pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.384953 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.384864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:13.392715 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.392692 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-65tnl" Apr 17 11:16:13.401391 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.401350 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:13.405949 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.405932 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fgzfd" Apr 17 11:16:13.413526 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.413502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jc7f5" Apr 17 11:16:13.420059 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.420037 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" Apr 17 11:16:13.427601 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.427581 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-v728t" Apr 17 11:16:13.435251 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.435220 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-prk6q" Apr 17 11:16:13.440916 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.440886 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rctf4" Apr 17 11:16:13.523968 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.523940 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:13.699840 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.699805 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4022fde_6cb7_4448_ba75_34477921e084.slice/crio-5ba79bdbbc9422d7c0259515f185bc3444b239e5eb0427b9a3ebc4508b6f73ca WatchSource:0}: Error finding container 5ba79bdbbc9422d7c0259515f185bc3444b239e5eb0427b9a3ebc4508b6f73ca: Status 404 returned error can't find the container with id 5ba79bdbbc9422d7c0259515f185bc3444b239e5eb0427b9a3ebc4508b6f73ca Apr 17 11:16:13.701600 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.701497 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe735422_56b8_4ef0_8974_325284a7057a.slice/crio-1f35d1109f26bec2863697eca2b837ad086497b5587873aebc1734fce19c787a WatchSource:0}: Error finding container 1f35d1109f26bec2863697eca2b837ad086497b5587873aebc1734fce19c787a: Status 404 returned error can't find the container with id 1f35d1109f26bec2863697eca2b837ad086497b5587873aebc1734fce19c787a Apr 17 11:16:13.704992 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.704967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:13.705099 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.705070 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c820c8f_2002_4e3b_afd9_88115414ecc4.slice/crio-0598bc4e219b50f560ee2b67694a374874b45749ba6b31ba6e358297a8f70ea5 WatchSource:0}: Error finding container 0598bc4e219b50f560ee2b67694a374874b45749ba6b31ba6e358297a8f70ea5: Status 404 returned error can't find the container with id 0598bc4e219b50f560ee2b67694a374874b45749ba6b31ba6e358297a8f70ea5 Apr 17 11:16:13.705099 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.705087 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.705279 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.705152 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.705130153 +0000 UTC m=+4.101053249 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.705633 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.705582 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02fceaee_2358_4389_a551_6c489878daca.slice/crio-b667e7ea171cf7083ebd5876bf8bd9563fb91d05e25311bb48c430790f7c6050 WatchSource:0}: Error finding container b667e7ea171cf7083ebd5876bf8bd9563fb91d05e25311bb48c430790f7c6050: Status 404 returned error can't find the container with id b667e7ea171cf7083ebd5876bf8bd9563fb91d05e25311bb48c430790f7c6050 Apr 17 11:16:13.706281 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.706257 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05781f98_6d49_4771_a747_d678a55de76e.slice/crio-cc64d201b84558e3b8a1911e23858466f98e12e24cbc5dc4b350c77e524d3a43 WatchSource:0}: Error finding container cc64d201b84558e3b8a1911e23858466f98e12e24cbc5dc4b350c77e524d3a43: Status 404 returned error can't find the container with id cc64d201b84558e3b8a1911e23858466f98e12e24cbc5dc4b350c77e524d3a43 Apr 17 11:16:13.707116 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.707046 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b7facb_4c12_4174_a583_430fbb53bf63.slice/crio-b36a10a40b86519ce4c2d3369ca1b89fb0a43022cbc27f6557c31131f5023f18 WatchSource:0}: Error finding container b36a10a40b86519ce4c2d3369ca1b89fb0a43022cbc27f6557c31131f5023f18: Status 404 returned error can't find the container with id b36a10a40b86519ce4c2d3369ca1b89fb0a43022cbc27f6557c31131f5023f18 Apr 17 11:16:13.708664 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.708403 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6d0851_5688_40f9_8967_116e7a6bddf3.slice/crio-a7c0a6c72e8c9450bc3d8a1cca5c9b9dff66e26bd179f57779e70b26deb4f7cc WatchSource:0}: Error finding container a7c0a6c72e8c9450bc3d8a1cca5c9b9dff66e26bd179f57779e70b26deb4f7cc: Status 404 returned error can't find the container with id a7c0a6c72e8c9450bc3d8a1cca5c9b9dff66e26bd179f57779e70b26deb4f7cc Apr 17 11:16:13.709198 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.709131 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6c5567f_d00d_4e77_b239_f0ad9016d0b1.slice/crio-e16e1fb64fa3b4f6d95a4a61f319f54df6c6c45395c56b941d39c4266f064d2a WatchSource:0}: Error finding container e16e1fb64fa3b4f6d95a4a61f319f54df6c6c45395c56b941d39c4266f064d2a: Status 404 returned error can't find the container with id e16e1fb64fa3b4f6d95a4a61f319f54df6c6c45395c56b941d39c4266f064d2a Apr 17 11:16:13.711262 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:13.711235 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc3cf5b_5d2a_40f0_a694_f05e7539986c.slice/crio-7221a61cff1507276bad7cc515fe3cc089ae49250eff831402ac9aa6a93c5c6d WatchSource:0}: Error finding container 7221a61cff1507276bad7cc515fe3cc089ae49250eff831402ac9aa6a93c5c6d: Status 404 returned error can't find the 
container with id 7221a61cff1507276bad7cc515fe3cc089ae49250eff831402ac9aa6a93c5c6d Apr 17 11:16:13.806359 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.806157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:13.806506 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.806284 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:13.806506 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:13.806391 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:13.806506 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.806427 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:13.806506 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.806443 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mt448 for pod openshift-network-diagnostics/network-check-target-7xnhl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.806506 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.806490 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:13.806506 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.806502 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448 podName:27626c71-9dab-4636-93f8-f3321c44e711 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.806484444 +0000 UTC m=+4.202407547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mt448" (UniqueName: "kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448") pod "network-check-target-7xnhl" (UID: "27626c71-9dab-4636-93f8-f3321c44e711") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.806803 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:13.806552 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret podName:cf904787-1ca2-44e2-a227-75aa1d60f7a0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.806534452 +0000 UTC m=+4.202457555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret") pod "global-pull-secret-syncer-27glw" (UID: "cf904787-1ca2-44e2-a227-75aa1d60f7a0") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:14.186995 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.186949 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:12 +0000 UTC" deadline="2028-01-10 13:06:07.118605076 +0000 UTC" Apr 17 11:16:14.186995 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.186991 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15193h49m52.931617098s" Apr 17 11:16:14.211872 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.211841 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:14.212032 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.211959 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:14.230625 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.230290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-29v4q" event={"ID":"a6c5567f-d00d-4e77-b239-f0ad9016d0b1","Type":"ContainerStarted","Data":"e16e1fb64fa3b4f6d95a4a61f319f54df6c6c45395c56b941d39c4266f064d2a"} Apr 17 11:16:14.234158 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.234101 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-65tnl" event={"ID":"7c6d0851-5688-40f9-8967-116e7a6bddf3","Type":"ContainerStarted","Data":"a7c0a6c72e8c9450bc3d8a1cca5c9b9dff66e26bd179f57779e70b26deb4f7cc"} Apr 17 11:16:14.237189 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.237139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerStarted","Data":"b36a10a40b86519ce4c2d3369ca1b89fb0a43022cbc27f6557c31131f5023f18"} Apr 17 11:16:14.238935 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.238870 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rctf4" event={"ID":"05781f98-6d49-4771-a747-d678a55de76e","Type":"ContainerStarted","Data":"cc64d201b84558e3b8a1911e23858466f98e12e24cbc5dc4b350c77e524d3a43"} Apr 17 11:16:14.241743 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.241717 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jc7f5" event={"ID":"4c820c8f-2002-4e3b-afd9-88115414ecc4","Type":"ContainerStarted","Data":"0598bc4e219b50f560ee2b67694a374874b45749ba6b31ba6e358297a8f70ea5"} Apr 17 11:16:14.254583 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.254549 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal" event={"ID":"69d674d86c0903ec8afec4ccfab0ec85","Type":"ContainerStarted","Data":"8ea04744695c5081c2439bf35a0af0abc77392ceb3c0ebf6704075b5b184456e"} 
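The durationBeforeRetry values in the mount failures above double on each attempt: 500ms at 11:16:13, then 1s, and further down 2s, 4s, and 8s. Below is a minimal, self-contained Go sketch of that doubling-backoff retry pattern; it is an illustration, not the kubelet's actual nestedpendingoperations code, and the cap value and the setUpVolume helper are assumptions made for the example.

package main

import (
	"errors"
	"fmt"
	"time"
)

// errNotRegistered mimics the "object ... not registered" failures in the
// log: the referenced secret/configMap is not yet available to the pod.
var errNotRegistered = errors.New(`object "openshift-multus"/"metrics-daemon-secret" not registered`)

// setUpVolume is a hypothetical stand-in for MountVolume.SetUp; here it
// fails until the object becomes available (modelled as attempt >= 5).
func setUpVolume(attempt int) error {
	if attempt < 5 {
		return errNotRegistered
	}
	return nil
}

func main() {
	backoff := 500 * time.Millisecond                // first durationBeforeRetry seen in the log
	const maxBackoff = 2*time.Minute + 2*time.Second // assumed cap, for illustration only
	for attempt := 0; ; attempt++ {
		err := setUpVolume(attempt)
		if err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		}
		fmt.Printf("failed: %v; no retries permitted for %v\n", err, backoff)
		time.Sleep(backoff)
		backoff *= 2 // doubling produces the 500ms -> 1s -> 2s -> 4s -> 8s progression
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}

Under these assumptions the printed delays reproduce the progression recorded in the entries above; the kubelet's own retry bookkeeping is the nestedpendingoperations.go:348 code path cited in those entries.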
Apr 17 11:16:14.257229 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.257202 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" event={"ID":"3dc3cf5b-5d2a-40f0-a694-f05e7539986c","Type":"ContainerStarted","Data":"7221a61cff1507276bad7cc515fe3cc089ae49250eff831402ac9aa6a93c5c6d"} Apr 17 11:16:14.260231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.260203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgzfd" event={"ID":"02fceaee-2358-4389-a551-6c489878daca","Type":"ContainerStarted","Data":"b667e7ea171cf7083ebd5876bf8bd9563fb91d05e25311bb48c430790f7c6050"} Apr 17 11:16:14.263778 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.263753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-prk6q" event={"ID":"be735422-56b8-4ef0-8974-325284a7057a","Type":"ContainerStarted","Data":"1f35d1109f26bec2863697eca2b837ad086497b5587873aebc1734fce19c787a"} Apr 17 11:16:14.278415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.278362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"5ba79bdbbc9422d7c0259515f185bc3444b239e5eb0427b9a3ebc4508b6f73ca"} Apr 17 11:16:14.714190 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.714153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:14.714392 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.714351 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.714454 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.714433 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:16.714413326 +0000 UTC m=+6.110336414 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.814762 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.814720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:14.814928 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:14.814782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:14.814928 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.814897 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:14.815036 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.814964 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret podName:cf904787-1ca2-44e2-a227-75aa1d60f7a0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:16.814945808 +0000 UTC m=+6.210868905 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret") pod "global-pull-secret-syncer-27glw" (UID: "cf904787-1ca2-44e2-a227-75aa1d60f7a0") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:14.815411 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.815364 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:14.815411 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.815404 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:14.815545 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.815417 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mt448 for pod openshift-network-diagnostics/network-check-target-7xnhl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.815545 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:14.815461 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448 podName:27626c71-9dab-4636-93f8-f3321c44e711 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:16.815446934 +0000 UTC m=+6.211370032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mt448" (UniqueName: "kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448") pod "network-check-target-7xnhl" (UID: "27626c71-9dab-4636-93f8-f3321c44e711") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:15.213637 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:15.213553 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:15.213637 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:15.213580 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:15.214133 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:15.213690 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:15.214190 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:15.214135 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:15.293222 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:15.293181 2571 generic.go:358] "Generic (PLEG): container finished" podID="d762e99710cc8c4964384ecaf1747fc1" containerID="79b49f61ff96dc625bb3bff83cac403a2de4146784dee6f8a98176d687a3a5c4" exitCode=0 Apr 17 11:16:15.293974 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:15.293734 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal" event={"ID":"d762e99710cc8c4964384ecaf1747fc1","Type":"ContainerDied","Data":"79b49f61ff96dc625bb3bff83cac403a2de4146784dee6f8a98176d687a3a5c4"} Apr 17 11:16:15.306634 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:15.306571 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-94.ec2.internal" podStartSLOduration=3.306550261 podStartE2EDuration="3.306550261s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:14.2703644 +0000 UTC m=+3.666287504" watchObservedRunningTime="2026-04-17 11:16:15.306550261 +0000 UTC m=+4.702473347" Apr 17 11:16:16.212355 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:16.212271 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:16.212626 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.212418 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:16.308947 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:16.308264 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal" event={"ID":"d762e99710cc8c4964384ecaf1747fc1","Type":"ContainerStarted","Data":"30ae44b93dad144bfe1a2f0cba3852b310e06bb596478ac8824d1131b6d233e0"} Apr 17 11:16:16.731706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:16.731663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:16.731891 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.731855 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:16.731964 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.731945 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.731920749 +0000 UTC m=+10.127843844 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:16.832410 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:16.832357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:16.832593 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:16.832435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:16.832593 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.832570 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:16.832593 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.832578 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:16.832751 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.832604 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:16.832751 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.832617 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mt448 for pod openshift-network-diagnostics/network-check-target-7xnhl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:16.832751 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.832636 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret podName:cf904787-1ca2-44e2-a227-75aa1d60f7a0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.832617547 +0000 UTC m=+10.228540645 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret") pod "global-pull-secret-syncer-27glw" (UID: "cf904787-1ca2-44e2-a227-75aa1d60f7a0") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:16.832751 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:16.832672 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448 podName:27626c71-9dab-4636-93f8-f3321c44e711 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.832655865 +0000 UTC m=+10.228578948 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mt448" (UniqueName: "kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448") pod "network-check-target-7xnhl" (UID: "27626c71-9dab-4636-93f8-f3321c44e711") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:17.212683 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:17.211797 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:17.212683 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:17.211935 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:17.212683 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:17.212341 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:17.212683 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:17.212466 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:18.212201 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:18.212153 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:18.212687 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:18.212303 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:19.211931 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:19.211890 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:19.212107 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:19.212014 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:19.212107 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:19.211890 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:19.212238 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:19.212143 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:20.212272 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:20.212238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:20.212735 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.212392 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:20.768891 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:20.768310 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:20.768891 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.768476 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:20.768891 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.768548 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:28.768527132 +0000 UTC m=+18.164450232 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:20.869195 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:20.869156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:20.869213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.869322 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.869344 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.869357 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mt448 for pod openshift-network-diagnostics/network-check-target-7xnhl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.869357 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.869426 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448 podName:27626c71-9dab-4636-93f8-f3321c44e711 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:28.869408201 +0000 UTC m=+18.265331298 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mt448" (UniqueName: "kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448") pod "network-check-target-7xnhl" (UID: "27626c71-9dab-4636-93f8-f3321c44e711") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:20.869439 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:20.869445 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret podName:cf904787-1ca2-44e2-a227-75aa1d60f7a0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:28.869435721 +0000 UTC m=+18.265358811 (durationBeforeRetry 8s). 
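The retries above were pushed out by 8s each; on the next failure (11:16:28, below) the delay doubles to 16s, and later in this boot a brand-new volume starts at 500ms. That is the doubling-with-cap pattern the kubelet applies to failed volume operations in its nestedpendingoperations layer. A minimal sketch of the policy, assuming a 500ms initial delay, a factor of 2, and an assumed cap; the kubelet's actual constants live in its volume manager and may differ:

    package main

    import (
        "fmt"
        "time"
    )

    // Illustrative only: doubling backoff with a cap, matching the pattern
    // visible in this log (500ms, ..., 8s, 16s, ...).
    const (
        initialDelay = 500 * time.Millisecond          // assumed starting delay
        maxDelay     = 2*time.Minute + 2*time.Second   // assumed cap
    )

    func durationBeforeRetry(failures int) time.Duration {
        d := initialDelay
        for i := 1; i < failures; i++ {
            d *= 2 // delay doubles on every consecutive failure
            if d >= maxDelay {
                return maxDelay
            }
        }
        return d
    }

    func main() {
        for n := 1; n <= 10; n++ {
            fmt.Printf("failure %2d -> retry in %v\n", n, durationBeforeRetry(n))
        }
    }

With those assumptions, failure 5 lands on 8s and failure 6 on 16s, matching the durationBeforeRetry values recorded here.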
Apr 17 11:16:21.213393 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:21.213176 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:21.213393 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:21.213294 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0"
Apr 17 11:16:21.213883 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:21.213653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:21.213883 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:21.213762 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb"
Apr 17 11:16:22.212074 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:22.212028 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:22.212262 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:22.212213 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711"
Apr 17 11:16:23.212515 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:23.212428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:23.212978 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:23.212437 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:23.212978 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:23.212576 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0"
Apr 17 11:16:23.212978 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:23.212678 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb"
Apr 17 11:16:24.211875 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:24.211846 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:24.212035 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:24.211941 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711"
Apr 17 11:16:25.212308 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:25.212273 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:25.212803 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:25.212282 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:25.212803 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:25.212437 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb"
Apr 17 11:16:25.212803 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:25.212501 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0"
Apr 17 11:16:26.212573 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:26.212542 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:26.212934 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:26.212640 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711"
Apr 17 11:16:27.212269 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:27.212228 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:27.212460 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:27.212388 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0"
Apr 17 11:16:27.212460 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:27.212432 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:27.212592 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:27.212538 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb"
Apr 17 11:16:28.212726 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:28.212688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:28.213183 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.212834 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711"
Apr 17 11:16:28.827711 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:28.827664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:28.827983 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.827840 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:28.827983 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.827929 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.827907655 +0000 UTC m=+34.223830744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 11:16:28.928881 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:28.928842 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:28.929109 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:28.928890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:28.929109 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.929032 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 11:16:28.929109 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.929055 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 11:16:28.929109 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.929060 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:28.929330 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.929141 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret podName:cf904787-1ca2-44e2-a227-75aa1d60f7a0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.929125826 +0000 UTC m=+34.325048908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret") pod "global-pull-secret-syncer-27glw" (UID: "cf904787-1ca2-44e2-a227-75aa1d60f7a0") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:28.929330 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.929068 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mt448 for pod openshift-network-diagnostics/network-check-target-7xnhl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:28.929330 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:28.929228 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448 podName:27626c71-9dab-4636-93f8-f3321c44e711 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.929210043 +0000 UTC m=+34.325133150 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mt448" (UniqueName: "kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448") pod "network-check-target-7xnhl" (UID: "27626c71-9dab-4636-93f8-f3321c44e711") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 11:16:29.212552 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:29.212470 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:29.212552 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:29.212506 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:29.212784 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:29.212596 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0"
Apr 17 11:16:29.212784 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:29.212727 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb"
Apr 17 11:16:30.212634 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:30.212604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:30.212817 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:30.212707 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711"
Apr 17 11:16:31.213202 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.212956 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:31.213854 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:31.213311 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb"
Apr 17 11:16:31.213854 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.213016 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
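From here to roughly 11:16:42 the same pair of lines repeats about once a second for the same three pods: the kubelet re-queues them until a CNI config appears. When triaging a capture like this, it can help to collapse the noise into per-pod counts. A small hedged sketch (assumes the journal dump arrives on stdin and that each pod="..." field survives line-wrapping intact):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    // Tallies "Error syncing pod" occurrences per pod in a kubelet journal dump.
    // Usage: go run tally.go < kubelet.log
    func main() {
        re := regexp.MustCompile(`Error syncing pod.*pod="([^"]+)"`)
        counts := map[string]int{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
        for sc.Scan() {
            if m := re.FindStringSubmatch(sc.Text()); m != nil {
                counts[m[1]]++
            }
        }
        for pod, n := range counts {
            fmt.Printf("%6d  %s\n", n, pod)
        }
    }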
Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:31.213854 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:31.213440 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:31.336235 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.335952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rctf4" event={"ID":"05781f98-6d49-4771-a747-d678a55de76e","Type":"ContainerStarted","Data":"e588ab9887640a371ed61395736671058fef49cefa40d9d5db368fbebc3b4c50"} Apr 17 11:16:31.338278 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.338192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" event={"ID":"3dc3cf5b-5d2a-40f0-a694-f05e7539986c","Type":"ContainerStarted","Data":"1594ee2aa91a312299d19f071a7ae5ee7bd648cee3bb5a2b3b1c69cff1acc74b"} Apr 17 11:16:31.340140 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.339874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fgzfd" event={"ID":"02fceaee-2358-4389-a551-6c489878daca","Type":"ContainerStarted","Data":"01f732a529a1bae9436497d17783630643a3eed03937182af0c4d2898ee95ffa"} Apr 17 11:16:31.341558 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.341523 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-prk6q" event={"ID":"be735422-56b8-4ef0-8974-325284a7057a","Type":"ContainerStarted","Data":"a1b0c8a0d0a010659400e00ef2f7628e5853f4be2b225388247a875f2d0f2372"} Apr 17 11:16:31.343866 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.343838 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"7f037dffee853cd459213e78bea595061ccc1f7f771e8ebeb4777fa964ce6d2c"} Apr 17 11:16:31.343966 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.343876 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"a39668d6106021358e7bd2112b1c50bee99ec3ee69dd58a7f22aae63e6d42164"} Apr 17 11:16:31.343966 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.343891 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"cc82484a10459a43fb54a9180973ee02616eb357d89a0aec3d6393b1870eae65"} Apr 17 11:16:31.343966 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.343903 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"83b0cf9efa0a9851bffd996c65337c0ec471bf8aebf0afa23eff6da97e608450"} Apr 17 11:16:31.345129 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.345109 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-29v4q" 
event={"ID":"a6c5567f-d00d-4e77-b239-f0ad9016d0b1","Type":"ContainerStarted","Data":"66390953f29964cadb6032fe0ba51a6795a5a5fe6f899a7287e573553c236f97"} Apr 17 11:16:31.346648 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.346626 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-65tnl" event={"ID":"7c6d0851-5688-40f9-8967-116e7a6bddf3","Type":"ContainerStarted","Data":"b6abf30851fd919d4725da6f3562a67619c8c7a555de42acd28c80699e6ab420"} Apr 17 11:16:31.348121 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.348094 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerStarted","Data":"b067c08719bef4d4ef69a9557a25e98dd9f4a55a408449dd6b8a8a2e90b889d8"} Apr 17 11:16:31.348340 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.348303 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rctf4" podStartSLOduration=3.144127727 podStartE2EDuration="20.348292518s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.708096627 +0000 UTC m=+3.104019710" lastFinishedPulling="2026-04-17 11:16:30.912261409 +0000 UTC m=+20.308184501" observedRunningTime="2026-04-17 11:16:31.347976805 +0000 UTC m=+20.743899910" watchObservedRunningTime="2026-04-17 11:16:31.348292518 +0000 UTC m=+20.744215621" Apr 17 11:16:31.348439 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.348409 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-94.ec2.internal" podStartSLOduration=19.348401683 podStartE2EDuration="19.348401683s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:16.323870645 +0000 UTC m=+5.719793781" watchObservedRunningTime="2026-04-17 11:16:31.348401683 +0000 UTC m=+20.744324788" Apr 17 11:16:31.367057 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.366993 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-65tnl" podStartSLOduration=3.090016581 podStartE2EDuration="20.366975836s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.710694485 +0000 UTC m=+3.106617571" lastFinishedPulling="2026-04-17 11:16:30.987653731 +0000 UTC m=+20.383576826" observedRunningTime="2026-04-17 11:16:31.366528574 +0000 UTC m=+20.762451691" watchObservedRunningTime="2026-04-17 11:16:31.366975836 +0000 UTC m=+20.762898940" Apr 17 11:16:31.381514 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.381468 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-29v4q" podStartSLOduration=11.328519197 podStartE2EDuration="20.381452542s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.710905182 +0000 UTC m=+3.106828271" lastFinishedPulling="2026-04-17 11:16:22.763838518 +0000 UTC m=+12.159761616" observedRunningTime="2026-04-17 11:16:31.381038531 +0000 UTC m=+20.776961636" watchObservedRunningTime="2026-04-17 11:16:31.381452542 +0000 UTC m=+20.777375646" Apr 17 11:16:31.393315 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.393274 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fgzfd" 
podStartSLOduration=3.18926579 podStartE2EDuration="20.393255753s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.708152381 +0000 UTC m=+3.104075477" lastFinishedPulling="2026-04-17 11:16:30.912142345 +0000 UTC m=+20.308065440" observedRunningTime="2026-04-17 11:16:31.392864297 +0000 UTC m=+20.788787401" watchObservedRunningTime="2026-04-17 11:16:31.393255753 +0000 UTC m=+20.789178858" Apr 17 11:16:31.407007 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:31.406959 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-prk6q" podStartSLOduration=3.19661738 podStartE2EDuration="20.406943939s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.703510844 +0000 UTC m=+3.099433930" lastFinishedPulling="2026-04-17 11:16:30.913837393 +0000 UTC m=+20.309760489" observedRunningTime="2026-04-17 11:16:31.406647262 +0000 UTC m=+20.802570400" watchObservedRunningTime="2026-04-17 11:16:31.406943939 +0000 UTC m=+20.802867042" Apr 17 11:16:32.123689 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.123668 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 11:16:32.147633 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.147517 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:32.123685677Z","UUID":"63363d23-4efe-4592-a9af-00a4ec144054","Handler":null,"Name":"","Endpoint":""} Apr 17 11:16:32.148869 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.148852 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 11:16:32.148959 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.148876 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 11:16:32.212609 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.212526 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:32.212738 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:32.212627 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
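The pod_startup_latency_tracker entries above decode as follows: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Recomputing the node-resolver-rctf4 entry bears this reading out, provided the pull window is taken from the monotonic m=+... offsets:

    package main

    import (
        "fmt"
        "time"
    )

    // Recomputes node-resolver-rctf4's tracker numbers, assuming:
    //   E2E = watchObservedRunningTime - podCreationTimestamp
    //   SLO = E2E - (lastFinishedPulling - firstStartedPulling)
    func main() {
        created, _ := time.Parse(time.RFC3339Nano, "2026-04-17T11:16:11Z")
        running, _ := time.Parse(time.RFC3339Nano, "2026-04-17T11:16:31.348292518Z")

        // m=+ offsets in nanoseconds for firstStartedPulling / lastFinishedPulling.
        const firstPull, lastPull = 3104019710, 20308184501

        e2e := running.Sub(created)
        slo := e2e - time.Duration(lastPull-firstPull)
        fmt.Println(e2e, slo) // 20.348292518s 3.144127727s, exactly as logged
    }

The same arithmetic reproduces the kube-rbac-proxy-crio entry, where the zeroed pull timestamps make SLO and E2E coincide at 19.348401683s.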
pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:32.350716 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.350689 2571 generic.go:358] "Generic (PLEG): container finished" podID="31b7facb-4c12-4174-a583-430fbb53bf63" containerID="b067c08719bef4d4ef69a9557a25e98dd9f4a55a408449dd6b8a8a2e90b889d8" exitCode=0 Apr 17 11:16:32.351160 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.350769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerDied","Data":"b067c08719bef4d4ef69a9557a25e98dd9f4a55a408449dd6b8a8a2e90b889d8"} Apr 17 11:16:32.352156 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.352057 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jc7f5" event={"ID":"4c820c8f-2002-4e3b-afd9-88115414ecc4","Type":"ContainerStarted","Data":"2bd2ee8406e18a441116a74064040f4213e25e84ee85793b187d143092f0c53f"} Apr 17 11:16:32.353661 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.353639 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" event={"ID":"3dc3cf5b-5d2a-40f0-a694-f05e7539986c","Type":"ContainerStarted","Data":"12f546607fc59099bd65ed614ec48f294117a994cda70d9e80b3d22a42e9b4cf"} Apr 17 11:16:32.356286 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.356260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"c348cf73025037b01ac68725f3b11288601b6e95e2521b39ed1c67d1f2ac8683"} Apr 17 11:16:32.356401 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.356315 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"f7e335db5552613702967444b37883520cceda853d214d984be73365a9c2e183"} Apr 17 11:16:32.384870 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:32.384822 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jc7f5" podStartSLOduration=4.17944937 podStartE2EDuration="21.384807943s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.706818586 +0000 UTC m=+3.102741673" lastFinishedPulling="2026-04-17 11:16:30.912177158 +0000 UTC m=+20.308100246" observedRunningTime="2026-04-17 11:16:32.384471688 +0000 UTC m=+21.780394793" watchObservedRunningTime="2026-04-17 11:16:32.384807943 +0000 UTC m=+21.780731047" Apr 17 11:16:33.212237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:33.212136 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:33.212237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:33.212155 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:33.212468 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:33.212259 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:33.212468 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:33.212332 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:33.360432 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:33.360396 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" event={"ID":"3dc3cf5b-5d2a-40f0-a694-f05e7539986c","Type":"ContainerStarted","Data":"bda38283beedd8c66ebe9a7f0b58ede0271d8261657682e19ce238fde4635290"} Apr 17 11:16:33.375040 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:33.374996 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m6zb6" podStartSLOduration=3.186777742 podStartE2EDuration="22.374981533s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.712964774 +0000 UTC m=+3.108887855" lastFinishedPulling="2026-04-17 11:16:32.901168548 +0000 UTC m=+22.297091646" observedRunningTime="2026-04-17 11:16:33.373975418 +0000 UTC m=+22.769898521" watchObservedRunningTime="2026-04-17 11:16:33.374981533 +0000 UTC m=+22.770904638" Apr 17 11:16:33.935234 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:33.934972 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:33.935803 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:33.935737 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:34.212770 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:34.212689 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:34.212917 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:34.212869 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:34.365866 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:34.365834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"f6d9e1712bf0e5f773e89c6afc6fcb58321649d7182cdaea53ae3b7cc5a114c4"} Apr 17 11:16:34.366318 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:34.366011 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:34.366648 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:34.366627 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-29v4q" Apr 17 11:16:35.212161 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:35.212129 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:35.212399 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:35.212136 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:35.212399 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:35.212253 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:35.212399 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:35.212335 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:36.211922 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:36.211888 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:36.212430 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:36.212000 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:37.212170 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.211992 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:37.212682 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.212059 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:37.212682 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:37.212235 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:37.212682 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:37.212349 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:37.374386 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.374322 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" event={"ID":"b4022fde-6cb7-4448-ba75-34477921e084","Type":"ContainerStarted","Data":"10f990dd5b199d0dacaa668bce60ba17f526436367480e069369443580ed46a7"} Apr 17 11:16:37.374604 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.374583 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:37.376197 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.376165 2571 generic.go:358] "Generic (PLEG): container finished" podID="31b7facb-4c12-4174-a583-430fbb53bf63" containerID="cf7d3fa71d62e988775a75da2e6887210c5e7cd3bfc3b80db928e495dd262c6b" exitCode=0 Apr 17 11:16:37.376306 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.376206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerDied","Data":"cf7d3fa71d62e988775a75da2e6887210c5e7cd3bfc3b80db928e495dd262c6b"} Apr 17 11:16:37.388969 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.388947 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:37.399364 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:37.399326 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" podStartSLOduration=9.1261021 podStartE2EDuration="26.399314966s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.702018054 +0000 UTC m=+3.097941137" lastFinishedPulling="2026-04-17 11:16:30.975230922 +0000 UTC m=+20.371154003" observedRunningTime="2026-04-17 11:16:37.399025957 +0000 UTC m=+26.794949061" watchObservedRunningTime="2026-04-17 11:16:37.399314966 +0000 UTC m=+26.795238070" Apr 17 11:16:38.212389 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.212348 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:38.212708 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:38.212467 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:38.380955 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.380861 2571 generic.go:358] "Generic (PLEG): container finished" podID="31b7facb-4c12-4174-a583-430fbb53bf63" containerID="6d3ee55cfecf767a4fd9044ee2e6eb741a0f9ecf8ea3284af5808dbdecf215cd" exitCode=0 Apr 17 11:16:38.381098 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.380944 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerDied","Data":"6d3ee55cfecf767a4fd9044ee2e6eb741a0f9ecf8ea3284af5808dbdecf215cd"} Apr 17 11:16:38.381737 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.381676 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:38.381737 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.381714 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:38.397614 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.397587 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc" Apr 17 11:16:38.632339 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.632269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7xnhl"] Apr 17 11:16:38.632508 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.632393 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:38.632556 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:38.632501 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:38.637749 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.637725 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zsnbl"] Apr 17 11:16:38.637864 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.637818 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:38.637929 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:38.637914 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:38.656559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.656533 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-27glw"] Apr 17 11:16:38.656684 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:38.656628 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:38.656741 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:38.656699 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:39.384962 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:39.384868 2571 generic.go:358] "Generic (PLEG): container finished" podID="31b7facb-4c12-4174-a583-430fbb53bf63" containerID="598a7aed832a58ad926e2016df383517fbfb08289b116b029cb8276201c9007e" exitCode=0 Apr 17 11:16:39.384962 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:39.384949 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerDied","Data":"598a7aed832a58ad926e2016df383517fbfb08289b116b029cb8276201c9007e"} Apr 17 11:16:40.212209 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:40.211864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:40.212363 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:40.212069 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:40.212363 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:40.212253 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:40.212363 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:40.212068 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:40.212363 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:40.212327 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:40.212565 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:40.212428 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:42.212753 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:42.212639 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:42.213278 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:42.213093 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7xnhl" podUID="27626c71-9dab-4636-93f8-f3321c44e711" Apr 17 11:16:42.213278 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:42.213274 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:42.213441 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:42.213418 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-27glw" podUID="cf904787-1ca2-44e2-a227-75aa1d60f7a0" Apr 17 11:16:42.213506 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:42.213483 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:42.213694 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:42.213669 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:16:43.907059 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.906984 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-94.ec2.internal" event="NodeReady" Apr 17 11:16:43.907673 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.907140 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:43.960431 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.960399 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6wgpx"] Apr 17 11:16:43.993065 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.993031 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l6vms"] Apr 17 11:16:43.993231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.993206 2571 util.go:30] "No sandbox for pod can be found. 
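The CNI retry loop ends here for a reason visible in the log itself: ovnkube-node-wjrrc passed readiness at 11:16:37-38, the kubelet records NodeReady at 11:16:43.907, and the API server immediately hands the node net-new pods (dns-default-6wgpx, ingress-canary-l6vms). To watch for that transition from outside the node, a hedged client-go sketch that prints a node's Ready condition (standard client-go calls; kubeconfig handling kept to the bare minimum):

    package main

    import (
        "context"
        "fmt"
        "os"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // Usage: go run nodeready.go ip-10-0-129-94.ec2.internal
    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        node, err := cs.CoreV1().Nodes().Get(context.TODO(), os.Args[1], metav1.GetOptions{})
        if err != nil {
            panic(err)
        }
        for _, c := range node.Status.Conditions {
            if c.Type == corev1.NodeReady {
                fmt.Printf("Ready=%s reason=%s msg=%s\n", c.Status, c.Reason, c.Message)
            }
        }
    }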
Need to start a new one" pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:43.995522 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.995495 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 11:16:43.995844 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.995823 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bd2vh\""
Apr 17 11:16:43.995936 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:43.995825 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 11:16:44.016936 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.016911 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6wgpx"]
Apr 17 11:16:44.017052 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.016941 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l6vms"]
Apr 17 11:16:44.017146 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.017047 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.019217 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.019191 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j8tns\""
Apr 17 11:16:44.019346 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.019275 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 11:16:44.019443 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.019352 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 11:16:44.019591 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.019575 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 11:16:44.146066 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.146027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-tmp-dir\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.146066 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.146079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-config-volume\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.146314 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.146152 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.146314 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.146228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92gr\" (UniqueName: \"kubernetes.io/projected/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-kube-api-access-g92gr\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.146314 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.146264 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.146497 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.146334 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk647\" (UniqueName: \"kubernetes.io/projected/b6d8ae60-a18c-4042-87a0-4790a47763c3-kube-api-access-qk647\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.212145 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.212061 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw"
Apr 17 11:16:44.212315 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.212061 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:16:44.212315 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.212059 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:16:44.216242 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.216218 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:16:44.216242 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.216233 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-g9l5h\""
Apr 17 11:16:44.216458 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.216255 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 11:16:44.216771 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.216743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 11:16:44.216771 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.216744 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nzl46\""
Apr 17 11:16:44.216918 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.216835 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 11:16:44.247130 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.247264 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g92gr\" (UniqueName: \"kubernetes.io/projected/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-kube-api-access-g92gr\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.247264 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247165 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.247264 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk647\" (UniqueName: \"kubernetes.io/projected/b6d8ae60-a18c-4042-87a0-4790a47763c3-kube-api-access-qk647\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.247264 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.247259 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:16:44.247497 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.247312 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:16:44.247497 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.247324 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.747304707 +0000 UTC m=+34.143227808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found
Apr 17 11:16:44.247497 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-tmp-dir\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.247497 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.247389 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.747355496 +0000 UTC m=+34.143278577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found
Apr 17 11:16:44.247497 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-config-volume\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.247753 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247563 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-tmp-dir\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.247897 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.247878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-config-volume\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.258453 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.258426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g92gr\" (UniqueName: \"kubernetes.io/projected/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-kube-api-access-g92gr\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.258453 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.258441 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk647\" (UniqueName: \"kubernetes.io/projected/b6d8ae60-a18c-4042-87a0-4790a47763c3-kube-api-access-qk647\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.752078 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.752037 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:44.752078 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.752090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:44.752324 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.752204 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:16:44.752324 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.752227 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
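[Editor's note: the nestedpendingoperations entries above and below show the kubelet's retry policy for failed volume mounts: each failure schedules the next attempt after a durationBeforeRetry that doubles on every failure, 500ms here and then, later in this capture, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s and 2m2s. A minimal, hypothetical Go sketch of that doubling-with-cap pattern, for illustration only; retryWithBackoff, initialDelay and maxDelay are our names, not kubelet API:]

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff illustrates the retry cadence visible in the
// nestedpendingoperations log entries: on each failure the delay
// doubles (500ms -> 1s -> 2s -> ...) until it reaches maxDelay.
func retryWithBackoff(op func() error, initialDelay, maxDelay time.Duration, attempts int) error {
	delay := initialDelay
	var err error
	for i := 0; i < attempts; i++ {
		if err = op(); err == nil {
			return nil
		}
		fmt.Printf("attempt %d failed (%v); no retries permitted for %s\n", i+1, err, delay)
		time.Sleep(delay)
		delay *= 2 // exponential growth, as in the log
		if delay > maxDelay {
			delay = maxDelay // the log's intervals top out around 2m2s
		}
	}
	return err
}

func main() {
	// Stand-in for MountVolume.SetUp failing on a missing secret.
	mount := func() error { return errors.New(`secret "canary-serving-cert" not found`) }
	_ = retryWithBackoff(mount, 500*time.Millisecond, 2*time.Second, 4)
}
```

[The delays are not state the operator clears; they simply bound how quickly the kubelet re-checks the API for the still-missing secret.]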
"{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.752253827 +0000 UTC m=+35.148176909 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found Apr 17 11:16:44.752324 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.752290 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.752275297 +0000 UTC m=+35.148198380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found Apr 17 11:16:44.853227 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.853189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:16:44.853461 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.853330 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 11:16:44.853461 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:44.853431 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:17:16.853409621 +0000 UTC m=+66.249332717 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : secret "metrics-daemon-secret" not found Apr 17 11:16:44.954276 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.954228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:44.954987 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.954347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:44.956616 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.956585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf904787-1ca2-44e2-a227-75aa1d60f7a0-original-pull-secret\") pod \"global-pull-secret-syncer-27glw\" (UID: \"cf904787-1ca2-44e2-a227-75aa1d60f7a0\") " pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:44.956782 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:44.956766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt448\" (UniqueName: \"kubernetes.io/projected/27626c71-9dab-4636-93f8-f3321c44e711-kube-api-access-mt448\") pod \"network-check-target-7xnhl\" (UID: \"27626c71-9dab-4636-93f8-f3321c44e711\") " pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:45.124234 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.124202 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-27glw" Apr 17 11:16:45.139223 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.139193 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:45.280808 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.280607 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-27glw"] Apr 17 11:16:45.304158 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:45.304125 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf904787_1ca2_44e2_a227_75aa1d60f7a0.slice/crio-d4f2e8a211867323937a7a456074c29e60cb674d4ab6002f7121bf6c0e9844f0 WatchSource:0}: Error finding container d4f2e8a211867323937a7a456074c29e60cb674d4ab6002f7121bf6c0e9844f0: Status 404 returned error can't find the container with id d4f2e8a211867323937a7a456074c29e60cb674d4ab6002f7121bf6c0e9844f0 Apr 17 11:16:45.401313 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.401257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-27glw" event={"ID":"cf904787-1ca2-44e2-a227-75aa1d60f7a0","Type":"ContainerStarted","Data":"d4f2e8a211867323937a7a456074c29e60cb674d4ab6002f7121bf6c0e9844f0"} Apr 17 11:16:45.449225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.449175 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7xnhl"] Apr 17 11:16:45.458430 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:45.458399 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27626c71_9dab_4636_93f8_f3321c44e711.slice/crio-72f5ac92aa12a836130a04f11458750bc840f169e283d289dac100536b61b508 WatchSource:0}: Error finding container 72f5ac92aa12a836130a04f11458750bc840f169e283d289dac100536b61b508: Status 404 returned error can't find the container with id 72f5ac92aa12a836130a04f11458750bc840f169e283d289dac100536b61b508 Apr 17 11:16:45.760315 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.760279 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx" Apr 17 11:16:45.760503 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:45.760340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms" Apr 17 11:16:45.760503 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:45.760436 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:45.760503 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:45.760445 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:45.760503 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:45.760489 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.760473509 +0000 UTC m=+37.156396591 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found Apr 17 11:16:45.760503 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:45.760502 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.760496285 +0000 UTC m=+37.156419367 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found Apr 17 11:16:46.404131 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:46.404097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7xnhl" event={"ID":"27626c71-9dab-4636-93f8-f3321c44e711","Type":"ContainerStarted","Data":"72f5ac92aa12a836130a04f11458750bc840f169e283d289dac100536b61b508"} Apr 17 11:16:46.407050 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:46.407018 2571 generic.go:358] "Generic (PLEG): container finished" podID="31b7facb-4c12-4174-a583-430fbb53bf63" containerID="2dd763f051994ee0390661579b8fe482be03eab646ee118032d6ca4d73b88e25" exitCode=0 Apr 17 11:16:46.407209 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:46.407138 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerDied","Data":"2dd763f051994ee0390661579b8fe482be03eab646ee118032d6ca4d73b88e25"} Apr 17 11:16:47.412337 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:47.412296 2571 generic.go:358] "Generic (PLEG): container finished" podID="31b7facb-4c12-4174-a583-430fbb53bf63" containerID="d3fa8b9ac72ce201a6723bff89da2f321875ac5173d85479aef34d09290ac44f" exitCode=0 Apr 17 11:16:47.412823 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:47.412350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerDied","Data":"d3fa8b9ac72ce201a6723bff89da2f321875ac5173d85479aef34d09290ac44f"} Apr 17 11:16:47.777166 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:47.777095 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms" Apr 17 11:16:47.777166 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:47.777160 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx" Apr 17 11:16:47.777350 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:47.777263 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:47.777350 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:47.777307 2571 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:47.777350 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:47.777344 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.777323865 +0000 UTC m=+41.173246952 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found Apr 17 11:16:47.777478 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:47.777364 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.777354161 +0000 UTC m=+41.173277247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found Apr 17 11:16:49.418097 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:49.417837 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7xnhl" event={"ID":"27626c71-9dab-4636-93f8-f3321c44e711","Type":"ContainerStarted","Data":"90948b0efc90daf706f912d8f205097ade1e57d843c15200e62d82c9c9883a38"} Apr 17 11:16:49.418097 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:49.418109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7xnhl" Apr 17 11:16:49.421129 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:49.421106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-v728t" event={"ID":"31b7facb-4c12-4174-a583-430fbb53bf63","Type":"ContainerStarted","Data":"11b294691f057f8086986a417b9601f0979d1c0fa68cf7edc043c1934667450d"} Apr 17 11:16:49.449145 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:49.449099 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7xnhl" podStartSLOduration=35.259174742 podStartE2EDuration="38.449083596s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:45.460439638 +0000 UTC m=+34.856362720" lastFinishedPulling="2026-04-17 11:16:48.65034848 +0000 UTC m=+38.046271574" observedRunningTime="2026-04-17 11:16:49.448358104 +0000 UTC m=+38.844281208" watchObservedRunningTime="2026-04-17 11:16:49.449083596 +0000 UTC m=+38.845006701" Apr 17 11:16:49.483565 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:49.483358 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v728t" podStartSLOduration=6.846296911 podStartE2EDuration="38.483337615s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.709418895 +0000 UTC m=+3.105341988" lastFinishedPulling="2026-04-17 11:16:45.34645961 +0000 UTC m=+34.742382692" observedRunningTime="2026-04-17 11:16:49.482171156 +0000 UTC m=+38.878094260" watchObservedRunningTime="2026-04-17 
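[Editor's note: the pod_startup_latency_tracker entry above is plain arithmetic over the timestamps it prints: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (about 38.449s), and podStartSLOduration subtracts the image-pull window, lastFinishedPulling minus firstStartedPulling (about 35.259s). A small Go check of those numbers, using the timestamps copied from the entry; mustParse is our helper, not kubelet code:]

```go
package main

import (
	"fmt"
	"time"
)

// mustParse parses the "2026-04-17 11:16:11 +0000 UTC" style timestamps
// that appear in the pod_startup_latency_tracker entries.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values from the network-check-target-7xnhl entry above.
	created := mustParse("2026-04-17 11:16:11 +0000 UTC")
	firstPull := mustParse("2026-04-17 11:16:45.460439638 +0000 UTC")
	lastPull := mustParse("2026-04-17 11:16:48.65034848 +0000 UTC")
	running := mustParse("2026-04-17 11:16:49.449083596 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 38.449083596s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: ~35.259174754s
	fmt.Println(e2e, slo)
}
```

[The recomputed SLO value agrees with the logged 35.259174742 to within nanoseconds; the tiny residue presumably comes from the monotonic-clock (m=+...) readings the kubelet uses internally.]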
Apr 17 11:16:49.483565 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:49.483358 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-v728t" podStartSLOduration=6.846296911 podStartE2EDuration="38.483337615s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:13.709418895 +0000 UTC m=+3.105341988" lastFinishedPulling="2026-04-17 11:16:45.34645961 +0000 UTC m=+34.742382692" observedRunningTime="2026-04-17 11:16:49.482171156 +0000 UTC m=+38.878094260" watchObservedRunningTime="2026-04-17 11:16:49.483337615 +0000 UTC m=+38.879260732"
Apr 17 11:16:51.426612 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:51.426565 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-27glw" event={"ID":"cf904787-1ca2-44e2-a227-75aa1d60f7a0","Type":"ContainerStarted","Data":"bc864169c478f135d1cbdf7660dfefd5afebb76e42f04db3d3037ea9f288e3fb"}
Apr 17 11:16:51.444422 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:51.444339 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-27glw" podStartSLOduration=34.193590586 podStartE2EDuration="39.444320557s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="2026-04-17 11:16:45.324635081 +0000 UTC m=+34.720558163" lastFinishedPulling="2026-04-17 11:16:50.575365049 +0000 UTC m=+39.971288134" observedRunningTime="2026-04-17 11:16:51.443728656 +0000 UTC m=+40.839651772" watchObservedRunningTime="2026-04-17 11:16:51.444320557 +0000 UTC m=+40.840243662"
Apr 17 11:16:51.806852 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:51.806752 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:51.806852 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:51.806816 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:51.807043 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:51.806897 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:16:51.807043 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:51.806907 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:16:51.807043 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:51.806949 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:59.806936032 +0000 UTC m=+49.202859113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found
Apr 17 11:16:51.807043 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:51.806962 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:16:59.806956767 +0000 UTC m=+49.202879849 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found
Apr 17 11:16:56.905420 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.905388 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"]
Apr 17 11:16:56.950390 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.950337 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"]
Apr 17 11:16:56.950561 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.950449 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:56.952933 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.952908 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 11:16:56.953051 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.952946 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 11:16:56.953138 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.953111 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 11:16:56.953652 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.953635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 11:16:56.979631 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.979593 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"]
Apr 17 11:16:56.979631 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.979630 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"]
Apr 17 11:16:56.979811 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.979724 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:56.982884 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.982865 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 11:16:56.983039 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.983018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 11:16:56.983095 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.983042 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 11:16:56.983095 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:56.983062 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 11:16:57.040907 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.040870 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99bcs\" (UniqueName: \"kubernetes.io/projected/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-kube-api-access-99bcs\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.040907 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.040910 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-tmp\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.041118 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.040958 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.141545 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.141706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141564 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.141706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fc0a0424-86a2-44a2-8d5c-34b3cd005718-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.141706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141606 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.141706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99bcs\" (UniqueName: \"kubernetes.io/projected/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-kube-api-access-99bcs\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.141706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141680 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-hub\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.141897 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td2bx\" (UniqueName: \"kubernetes.io/projected/fc0a0424-86a2-44a2-8d5c-34b3cd005718-kube-api-access-td2bx\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.141897 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-tmp\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.141967 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.141908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-ca\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.142117 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.142101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-tmp\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.145805 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.145778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.150212 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.150191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99bcs\" (UniqueName: \"kubernetes.io/projected/08840162-a5b5-4cb0-9aef-1e6ce6c5d575-kube-api-access-99bcs\") pod \"klusterlet-addon-workmgr-7f5f6644b4-8gjmp\" (UID: \"08840162-a5b5-4cb0-9aef-1e6ce6c5d575\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.242997 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.242913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-hub\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.242997 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.242949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td2bx\" (UniqueName: \"kubernetes.io/projected/fc0a0424-86a2-44a2-8d5c-34b3cd005718-kube-api-access-td2bx\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.242997 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.242983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-ca\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.243247 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.243033 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.243247 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.243060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fc0a0424-86a2-44a2-8d5c-34b3cd005718-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.243247 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.243077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.243923 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.243893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fc0a0424-86a2-44a2-8d5c-34b3cd005718-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.245474 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.245452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-ca\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.245575 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.245457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-hub\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.246068 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.246047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.246068 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.246049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fc0a0424-86a2-44a2-8d5c-34b3cd005718-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.250698 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.250677 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td2bx\" (UniqueName: \"kubernetes.io/projected/fc0a0424-86a2-44a2-8d5c-34b3cd005718-kube-api-access-td2bx\") pod \"cluster-proxy-proxy-agent-59964b5894-l9t7n\" (UID: \"fc0a0424-86a2-44a2-8d5c-34b3cd005718\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.262310 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.260464 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:16:57.306063 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.306008 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"
Apr 17 11:16:57.385884 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.385847 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"]
Apr 17 11:16:57.390712 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:57.390681 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08840162_a5b5_4cb0_9aef_1e6ce6c5d575.slice/crio-63accb7e0999e9303c12a06d7cff49ccda66e05d0d3450478269afb12b32371a WatchSource:0}: Error finding container 63accb7e0999e9303c12a06d7cff49ccda66e05d0d3450478269afb12b32371a: Status 404 returned error can't find the container with id 63accb7e0999e9303c12a06d7cff49ccda66e05d0d3450478269afb12b32371a
Apr 17 11:16:57.428671 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.428626 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n"]
Apr 17 11:16:57.432046 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:16:57.432015 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc0a0424_86a2_44a2_8d5c_34b3cd005718.slice/crio-1a95382f064305e918caba4106a5df6a82b3586f793f2531b5030ccb663423b7 WatchSource:0}: Error finding container 1a95382f064305e918caba4106a5df6a82b3586f793f2531b5030ccb663423b7: Status 404 returned error can't find the container with id 1a95382f064305e918caba4106a5df6a82b3586f793f2531b5030ccb663423b7
Apr 17 11:16:57.439399 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.439351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" event={"ID":"08840162-a5b5-4cb0-9aef-1e6ce6c5d575","Type":"ContainerStarted","Data":"63accb7e0999e9303c12a06d7cff49ccda66e05d0d3450478269afb12b32371a"}
Apr 17 11:16:57.440312 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:57.440289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" event={"ID":"fc0a0424-86a2-44a2-8d5c-34b3cd005718","Type":"ContainerStarted","Data":"1a95382f064305e918caba4106a5df6a82b3586f793f2531b5030ccb663423b7"}
Apr 17 11:16:59.863053 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:59.863015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:16:59.863524 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:16:59.863098 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:16:59.863524 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:59.863218 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:16:59.863524 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:59.863221 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:16:59.863524 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:59.863287 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:15.863267568 +0000 UTC m=+65.259190654 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found
Apr 17 11:16:59.863524 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:16:59.863308 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:15.863298561 +0000 UTC m=+65.259221649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:02.451241 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:02.451213 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" event={"ID":"fc0a0424-86a2-44a2-8d5c-34b3cd005718","Type":"ContainerStarted","Data":"de80ef5bd79743089d65be56de0f340ee0513c1f9eda14d1b4158536ed870236"}
Apr 17 11:17:03.453809 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:03.453772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" event={"ID":"08840162-a5b5-4cb0-9aef-1e6ce6c5d575","Type":"ContainerStarted","Data":"68cf99a310d94b260c1f9d0787603d524ce0cc973f601a5035b6868f9b3450b8"}
Apr 17 11:17:03.454239 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:03.454000 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:17:03.455731 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:03.455708 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp"
Apr 17 11:17:03.469652 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:03.469611 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" podStartSLOduration=2.430635944 podStartE2EDuration="7.469597439s" podCreationTimestamp="2026-04-17 11:16:56 +0000 UTC" firstStartedPulling="2026-04-17 11:16:57.393112511 +0000 UTC m=+46.789035596" lastFinishedPulling="2026-04-17 11:17:02.432074006 +0000 UTC m=+51.827997091" observedRunningTime="2026-04-17 11:17:03.468976381 +0000 UTC m=+52.864899485" watchObservedRunningTime="2026-04-17 11:17:03.469597439 +0000 UTC m=+52.865520543"
Apr 17 11:17:05.460251 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:05.460153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" event={"ID":"fc0a0424-86a2-44a2-8d5c-34b3cd005718","Type":"ContainerStarted","Data":"3e5ed0e02514d7ea246d731549fb3fb83a194d793beeb340795cb060d17f6624"}
Apr 17 11:17:05.460251 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:05.460203 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" event={"ID":"fc0a0424-86a2-44a2-8d5c-34b3cd005718","Type":"ContainerStarted","Data":"00a4f341dfe628a2cd55a843ae66330fe46bf6f8792c818d6f67780165553133"}
Apr 17 11:17:05.478525 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:05.478476 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" podStartSLOduration=1.8220180940000001 podStartE2EDuration="9.478461941s" podCreationTimestamp="2026-04-17 11:16:56 +0000 UTC" firstStartedPulling="2026-04-17 11:16:57.433769724 +0000 UTC m=+46.829692805" lastFinishedPulling="2026-04-17 11:17:05.090213561 +0000 UTC m=+54.486136652" observedRunningTime="2026-04-17 11:17:05.477658886 +0000 UTC m=+54.873581991" watchObservedRunningTime="2026-04-17 11:17:05.478461941 +0000 UTC m=+54.874385045"
Apr 17 11:17:10.397744 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:10.397715 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wjrrc"
Apr 17 11:17:15.869699 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:15.869662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:17:15.870082 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:15.869714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:17:15.870082 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:15.869811 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:15.870082 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:15.869876 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:17:47.869862444 +0000 UTC m=+97.265785526 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:15.870082 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:15.869811 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:15.870082 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:15.869948 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:47.869936197 +0000 UTC m=+97.265859284 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found
Apr 17 11:17:16.877719 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:16.877677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:17:16.878087 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:16.877821 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:17:16.878087 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:16.877886 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:18:20.87787061 +0000 UTC m=+130.273793693 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : secret "metrics-daemon-secret" not found
Apr 17 11:17:20.425788 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:20.425757 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7xnhl"
Apr 17 11:17:47.898296 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:47.898240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms"
Apr 17 11:17:47.898921 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:17:47.898313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx"
Apr 17 11:17:47.898921 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:47.898409 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:47.898921 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:47.898436 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:47.898921 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:47.898498 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert podName:b6d8ae60-a18c-4042-87a0-4790a47763c3 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:51.898482512 +0000 UTC m=+161.294405594 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert") pod "ingress-canary-l6vms" (UID: "b6d8ae60-a18c-4042-87a0-4790a47763c3") : secret "canary-serving-cert" not found
Apr 17 11:17:47.898921 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:17:47.898516 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls podName:d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e nodeName:}" failed. No retries permitted until 2026-04-17 11:18:51.898508504 +0000 UTC m=+161.294431586 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls") pod "dns-default-6wgpx" (UID: "d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e") : secret "dns-default-metrics-tls" not found
Apr 17 11:18:13.400569 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:13.400535 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rctf4_05781f98-6d49-4771-a747-d678a55de76e/dns-node-resolver/0.log"
Apr 17 11:18:14.001519 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:14.001491 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fgzfd_02fceaee-2358-4389-a551-6c489878daca/node-ca/0.log"
Apr 17 11:18:20.941249 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:20.941207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl"
Apr 17 11:18:20.941719 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:20.941345 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:18:20.941719 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:20.941427 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs podName:44159d9f-1705-4830-8bfe-c087640f29cb nodeName:}" failed. No retries permitted until 2026-04-17 11:20:22.941410781 +0000 UTC m=+252.337333868 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs") pod "network-metrics-daemon-zsnbl" (UID: "44159d9f-1705-4830-8bfe-c087640f29cb") : secret "metrics-daemon-secret" not found
Apr 17 11:18:43.567603 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.567571 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wl8lk"]
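[Editor's note: every stuck mount in this capture is the same condition seen three times over: a Secret referenced by a pod volume (canary-serving-cert, dns-default-metrics-tls, metrics-daemon-secret) does not yet exist in the API, so MountVolume.SetUp fails until the owning operator creates it. A hedged client-go sketch of how one might confirm this from a separate debugging tool; this is not kubelet code, and it assumes a kubeconfig at the default location:]

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes $HOME/.kube/config; adjust for your environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// The namespace/name pairs this log reports as missing.
	missing := map[string]string{
		"openshift-ingress-canary": "canary-serving-cert",
		"openshift-dns":            "dns-default-metrics-tls",
		"openshift-multus":         "metrics-daemon-secret",
	}
	for ns, name := range missing {
		_, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		fmt.Printf("%s/%s: err=%v\n", ns, name, err)
	}
}
```

[Once the secrets appear, the kubelet's next scheduled retry should succeed on its own; no restart is needed.]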
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.572804 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.572777 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 11:18:43.573406 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.573362 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-64q7x\"" Apr 17 11:18:43.573558 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.573540 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 11:18:43.573977 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.573958 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 11:18:43.574262 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.574242 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 11:18:43.592691 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.592664 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wl8lk"] Apr 17 11:18:43.714506 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.714476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/308d97f5-9121-4af5-a32a-1b143d7593d9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.714669 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.714538 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/308d97f5-9121-4af5-a32a-1b143d7593d9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.714669 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.714588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr5hs\" (UniqueName: \"kubernetes.io/projected/308d97f5-9121-4af5-a32a-1b143d7593d9-kube-api-access-kr5hs\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.714737 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.714679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/308d97f5-9121-4af5-a32a-1b143d7593d9-crio-socket\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.714737 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.714700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/308d97f5-9121-4af5-a32a-1b143d7593d9-data-volume\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " 
pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.815953 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.815899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/308d97f5-9121-4af5-a32a-1b143d7593d9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.815953 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.815950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr5hs\" (UniqueName: \"kubernetes.io/projected/308d97f5-9121-4af5-a32a-1b143d7593d9-kube-api-access-kr5hs\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.816144 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.816006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/308d97f5-9121-4af5-a32a-1b143d7593d9-crio-socket\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.816144 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.816024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/308d97f5-9121-4af5-a32a-1b143d7593d9-data-volume\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.816144 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.816056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/308d97f5-9121-4af5-a32a-1b143d7593d9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.816278 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.816256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/308d97f5-9121-4af5-a32a-1b143d7593d9-crio-socket\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.816446 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.816422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/308d97f5-9121-4af5-a32a-1b143d7593d9-data-volume\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.816647 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.816631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/308d97f5-9121-4af5-a32a-1b143d7593d9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.818299 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.818256 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/308d97f5-9121-4af5-a32a-1b143d7593d9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.826531 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.826508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr5hs\" (UniqueName: \"kubernetes.io/projected/308d97f5-9121-4af5-a32a-1b143d7593d9-kube-api-access-kr5hs\") pod \"insights-runtime-extractor-wl8lk\" (UID: \"308d97f5-9121-4af5-a32a-1b143d7593d9\") " pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:43.879674 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:43.879642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wl8lk" Apr 17 11:18:44.001144 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:44.001111 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wl8lk"] Apr 17 11:18:44.692620 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:44.692583 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wl8lk" event={"ID":"308d97f5-9121-4af5-a32a-1b143d7593d9","Type":"ContainerStarted","Data":"cb9e87b2bad7d05f0bb9860d39f60fc237c5b00e766654a48a37ce942b1fdb1a"} Apr 17 11:18:44.692620 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:44.692620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wl8lk" event={"ID":"308d97f5-9121-4af5-a32a-1b143d7593d9","Type":"ContainerStarted","Data":"775d492479fd2ca0669c0a59b3ccb92498c9275ac0ffee5375731a575db773bb"} Apr 17 11:18:45.696543 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:45.696507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wl8lk" event={"ID":"308d97f5-9121-4af5-a32a-1b143d7593d9","Type":"ContainerStarted","Data":"233fd7cdcd3e27def7f5b610692cc32189c16c94c3d21f03caa3f5fe401cd667"} Apr 17 11:18:46.700555 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:46.700513 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wl8lk" event={"ID":"308d97f5-9121-4af5-a32a-1b143d7593d9","Type":"ContainerStarted","Data":"807eb602d8f15869c64fafb3ccc41ea22fd05d1048484a03964fbcd26e12b9ac"} Apr 17 11:18:46.718031 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:46.717982 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wl8lk" podStartSLOduration=1.6521165450000002 podStartE2EDuration="3.717966637s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:44.060963577 +0000 UTC m=+153.456886661" lastFinishedPulling="2026-04-17 11:18:46.12681367 +0000 UTC m=+155.522736753" observedRunningTime="2026-04-17 11:18:46.717084633 +0000 UTC m=+156.113007737" watchObservedRunningTime="2026-04-17 11:18:46.717966637 +0000 UTC m=+156.113889741" Apr 17 11:18:47.005975 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:47.005887 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6wgpx" 
podUID="d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e" Apr 17 11:18:47.027134 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:47.027097 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l6vms" podUID="b6d8ae60-a18c-4042-87a0-4790a47763c3" Apr 17 11:18:47.232550 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:47.232496 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zsnbl" podUID="44159d9f-1705-4830-8bfe-c087640f29cb" Apr 17 11:18:47.702575 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:47.702540 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wgpx" Apr 17 11:18:51.690679 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.690646 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-98dxk"] Apr 17 11:18:51.693897 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.693875 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.696263 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696239 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 11:18:51.696484 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696238 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 11:18:51.696484 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696315 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:18:51.696766 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696731 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 11:18:51.696766 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696745 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 11:18:51.696893 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696763 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2bm54\"" Apr 17 11:18:51.696893 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.696789 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 11:18:51.778198 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-tls\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778198 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-accelerators-collector-config\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778258 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-root\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778275 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-textfile\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-wtmp\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzt7\" (UniqueName: \"kubernetes.io/projected/8c3b726a-5a6a-4448-9528-b468829506bc-kube-api-access-qtzt7\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-sys\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.778559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.778528 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b726a-5a6a-4448-9528-b468829506bc-metrics-client-ca\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.879635 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879588 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b726a-5a6a-4448-9528-b468829506bc-metrics-client-ca\") pod \"node-exporter-98dxk\" (UID: 
\"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.879635 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-tls\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.879852 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-accelerators-collector-config\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.879852 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:51.879758 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:18:51.879852 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.879852 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:51.879831 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-tls podName:8c3b726a-5a6a-4448-9528-b468829506bc nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.379810616 +0000 UTC m=+161.775733699 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-tls") pod "node-exporter-98dxk" (UID: "8c3b726a-5a6a-4448-9528-b468829506bc") : secret "node-exporter-tls" not found Apr 17 11:18:51.880041 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-root\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880041 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879894 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-textfile\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880041 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879935 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-wtmp\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880041 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzt7\" (UniqueName: \"kubernetes.io/projected/8c3b726a-5a6a-4448-9528-b468829506bc-kube-api-access-qtzt7\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880041 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.880007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-sys\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.880074 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-sys\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.879935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-root\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.880096 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-wtmp\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880237 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.880195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c3b726a-5a6a-4448-9528-b468829506bc-metrics-client-ca\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.880431 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.880244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-accelerators-collector-config\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.881017 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.881000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-textfile\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.882046 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.882028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.888203 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.888180 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzt7\" (UniqueName: \"kubernetes.io/projected/8c3b726a-5a6a-4448-9528-b468829506bc-kube-api-access-qtzt7\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:51.981264 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.981182 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx" Apr 17 11:18:51.981416 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.981277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms" Apr 17 11:18:51.984647 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.984612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e-metrics-tls\") pod \"dns-default-6wgpx\" (UID: \"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e\") " pod="openshift-dns/dns-default-6wgpx" Apr 17 11:18:51.987193 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:51.987167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b6d8ae60-a18c-4042-87a0-4790a47763c3-cert\") pod \"ingress-canary-l6vms\" (UID: \"b6d8ae60-a18c-4042-87a0-4790a47763c3\") " pod="openshift-ingress-canary/ingress-canary-l6vms" Apr 17 11:18:52.205842 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.205812 2571 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bd2vh\"" Apr 17 11:18:52.214734 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.214709 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6wgpx" Apr 17 11:18:52.328885 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.328852 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6wgpx"] Apr 17 11:18:52.332551 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:18:52.332528 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c70e20_7fbd_4bbb_9ce6_10c8f99f985e.slice/crio-d36e1793ca110773bec6953a3dab7dc5988737c35ad36d85ddb715fcf513373d WatchSource:0}: Error finding container d36e1793ca110773bec6953a3dab7dc5988737c35ad36d85ddb715fcf513373d: Status 404 returned error can't find the container with id d36e1793ca110773bec6953a3dab7dc5988737c35ad36d85ddb715fcf513373d Apr 17 11:18:52.384855 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.384817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-tls\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:52.387049 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.387029 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8c3b726a-5a6a-4448-9528-b468829506bc-node-exporter-tls\") pod \"node-exporter-98dxk\" (UID: \"8c3b726a-5a6a-4448-9528-b468829506bc\") " pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:52.602643 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.602616 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-98dxk" Apr 17 11:18:52.610698 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:18:52.610643 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c3b726a_5a6a_4448_9528_b468829506bc.slice/crio-17596071ca3d61eaffae4b6ffb7e63f48ecd7860996fe079799a8dd22950a6c3 WatchSource:0}: Error finding container 17596071ca3d61eaffae4b6ffb7e63f48ecd7860996fe079799a8dd22950a6c3: Status 404 returned error can't find the container with id 17596071ca3d61eaffae4b6ffb7e63f48ecd7860996fe079799a8dd22950a6c3 Apr 17 11:18:52.714319 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.714278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wgpx" event={"ID":"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e","Type":"ContainerStarted","Data":"d36e1793ca110773bec6953a3dab7dc5988737c35ad36d85ddb715fcf513373d"} Apr 17 11:18:52.715265 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:52.715239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98dxk" event={"ID":"8c3b726a-5a6a-4448-9528-b468829506bc","Type":"ContainerStarted","Data":"17596071ca3d61eaffae4b6ffb7e63f48ecd7860996fe079799a8dd22950a6c3"} Apr 17 11:18:53.740365 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.740332 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz"] Apr 17 11:18:53.744038 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.744016 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.746216 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746155 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 11:18:53.746361 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746344 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 11:18:53.746552 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746540 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 11:18:53.746646 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746626 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 11:18:53.746725 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746646 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-13qfdsjlf679t\"" Apr 17 11:18:53.746725 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746705 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pk7m8\"" Apr 17 11:18:53.746725 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.746635 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 11:18:53.754628 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.754604 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz"] Apr 17 11:18:53.797735 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.797705 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.797874 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.797754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.797874 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.797786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64jlk\" (UniqueName: \"kubernetes.io/projected/5466274d-9675-4358-af95-c9c0e9882159-kube-api-access-64jlk\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.797874 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.797812 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.798023 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.797924 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.798073 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.798039 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-tls\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.798111 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.798071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5466274d-9675-4358-af95-c9c0e9882159-metrics-client-ca\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.798144 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.798110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-grpc-tls\") 
pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.898957 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.898915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-grpc-tls\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.898957 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.898960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.899203 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.899138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.899203 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.899181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64jlk\" (UniqueName: \"kubernetes.io/projected/5466274d-9675-4358-af95-c9c0e9882159-kube-api-access-64jlk\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.899310 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.899214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.899310 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.899267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.899422 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.899343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-tls\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.899422 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.899387 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5466274d-9675-4358-af95-c9c0e9882159-metrics-client-ca\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.900478 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.900446 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5466274d-9675-4358-af95-c9c0e9882159-metrics-client-ca\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.902884 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.902721 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.902884 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.902780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-grpc-tls\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.902884 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.902769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.902884 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.902867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-tls\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.903910 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.903851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.904454 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.904429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5466274d-9675-4358-af95-c9c0e9882159-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:53.916186 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:53.915749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-64jlk\" (UniqueName: \"kubernetes.io/projected/5466274d-9675-4358-af95-c9c0e9882159-kube-api-access-64jlk\") pod \"thanos-querier-6f7f856ccb-8jmgz\" (UID: \"5466274d-9675-4358-af95-c9c0e9882159\") " pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:54.053157 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.053122 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:54.200987 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.200953 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz"] Apr 17 11:18:54.206295 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:18:54.206262 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5466274d_9675_4358_af95_c9c0e9882159.slice/crio-ba267a5548316a650a144e28d4b42e24fe01471a4912b90fb8118af8d7bb55f6 WatchSource:0}: Error finding container ba267a5548316a650a144e28d4b42e24fe01471a4912b90fb8118af8d7bb55f6: Status 404 returned error can't find the container with id ba267a5548316a650a144e28d4b42e24fe01471a4912b90fb8118af8d7bb55f6 Apr 17 11:18:54.722330 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.722290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"ba267a5548316a650a144e28d4b42e24fe01471a4912b90fb8118af8d7bb55f6"} Apr 17 11:18:54.724151 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.724121 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wgpx" event={"ID":"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e","Type":"ContainerStarted","Data":"ddd878e540d13dbc50e37fe15af1854ff51afe7fb2265cc4c13b883031b299dd"} Apr 17 11:18:54.724303 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.724158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6wgpx" event={"ID":"d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e","Type":"ContainerStarted","Data":"e54a64d61cb3eb526fdc361bf716be40157bf2e328f0e263fc92a5e4c8825d37"} Apr 17 11:18:54.724303 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.724250 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6wgpx" Apr 17 11:18:54.725754 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.725713 2571 generic.go:358] "Generic (PLEG): container finished" podID="8c3b726a-5a6a-4448-9528-b468829506bc" containerID="ea064f2a3ea1f1e509a85de59627405311f529128b49e437a3fbed33907881ce" exitCode=0 Apr 17 11:18:54.725880 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.725750 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98dxk" event={"ID":"8c3b726a-5a6a-4448-9528-b468829506bc","Type":"ContainerDied","Data":"ea064f2a3ea1f1e509a85de59627405311f529128b49e437a3fbed33907881ce"} Apr 17 11:18:54.745223 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:54.745175 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6wgpx" podStartSLOduration=130.308582758 podStartE2EDuration="2m11.745157803s" podCreationTimestamp="2026-04-17 11:16:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:52.334132547 +0000 UTC m=+161.730055629" lastFinishedPulling="2026-04-17 11:18:53.770707587 +0000 UTC m=+163.166630674" observedRunningTime="2026-04-17 
11:18:54.744632313 +0000 UTC m=+164.140555430" watchObservedRunningTime="2026-04-17 11:18:54.745157803 +0000 UTC m=+164.141080951" Apr 17 11:18:55.731608 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:55.731561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98dxk" event={"ID":"8c3b726a-5a6a-4448-9528-b468829506bc","Type":"ContainerStarted","Data":"fc58bba7f3b92855ebd498cad37fd9e4497f92a94ef7e5f1a18a550d2bef7693"} Apr 17 11:18:55.731608 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:55.731611 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-98dxk" event={"ID":"8c3b726a-5a6a-4448-9528-b468829506bc","Type":"ContainerStarted","Data":"708654b7a73667a3f504607a2b45e92b1fdeef94b3f63a8fedaf82b0b6ec2657"} Apr 17 11:18:55.758431 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:55.758348 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-98dxk" podStartSLOduration=3.6038778799999998 podStartE2EDuration="4.758329364s" podCreationTimestamp="2026-04-17 11:18:51 +0000 UTC" firstStartedPulling="2026-04-17 11:18:52.612567763 +0000 UTC m=+162.008490848" lastFinishedPulling="2026-04-17 11:18:53.767019233 +0000 UTC m=+163.162942332" observedRunningTime="2026-04-17 11:18:55.756689589 +0000 UTC m=+165.152612745" watchObservedRunningTime="2026-04-17 11:18:55.758329364 +0000 UTC m=+165.154252470" Apr 17 11:18:56.486990 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.486951 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9"] Apr 17 11:18:56.489890 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.489874 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:56.492300 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.492278 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 11:18:56.493471 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.493453 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6dptq\"" Apr 17 11:18:56.499486 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.499464 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9"] Apr 17 11:18:56.523488 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.523457 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/896800c7-73d6-4a93-95ad-0095f218b33d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hp9n9\" (UID: \"896800c7-73d6-4a93-95ad-0095f218b33d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:56.623904 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.623862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/896800c7-73d6-4a93-95ad-0095f218b33d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hp9n9\" (UID: \"896800c7-73d6-4a93-95ad-0095f218b33d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:56.624087 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:56.624012 2571 secret.go:189] Couldn't get secret 
openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 11:18:56.624154 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:18:56.624092 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/896800c7-73d6-4a93-95ad-0095f218b33d-monitoring-plugin-cert podName:896800c7-73d6-4a93-95ad-0095f218b33d nodeName:}" failed. No retries permitted until 2026-04-17 11:18:57.124070152 +0000 UTC m=+166.519993234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/896800c7-73d6-4a93-95ad-0095f218b33d-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-hp9n9" (UID: "896800c7-73d6-4a93-95ad-0095f218b33d") : secret "monitoring-plugin-cert" not found Apr 17 11:18:56.736712 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.736670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"6343d3be6d3202210e5316f074cbdf3b407cfe814219cbf4a98f3492d4d75244"} Apr 17 11:18:56.736712 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.736721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"7ecf5e14fd57dcdbe846760bf0ff4f5f1949a6608ff54302c70f685039e84d1e"} Apr 17 11:18:56.736959 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:56.736739 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"36ee3e2dcac88b6cd6af09d462ab0b3c7691b01a944f599609c56fcfe2dfc366"} Apr 17 11:18:57.128834 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.128741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/896800c7-73d6-4a93-95ad-0095f218b33d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hp9n9\" (UID: \"896800c7-73d6-4a93-95ad-0095f218b33d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:57.131126 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.131101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/896800c7-73d6-4a93-95ad-0095f218b33d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hp9n9\" (UID: \"896800c7-73d6-4a93-95ad-0095f218b33d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:57.399055 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.398955 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:57.520651 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.520620 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9"] Apr 17 11:18:57.525496 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:18:57.525468 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod896800c7_73d6_4a93_95ad_0095f218b33d.slice/crio-e075b869a5773007cd464540861970d9f8104a9c55d284f7f5ab9ea8693a391a WatchSource:0}: Error finding container e075b869a5773007cd464540861970d9f8104a9c55d284f7f5ab9ea8693a391a: Status 404 returned error can't find the container with id e075b869a5773007cd464540861970d9f8104a9c55d284f7f5ab9ea8693a391a Apr 17 11:18:57.745866 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.745774 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"d5ac08c16d6e85d18d9e15bbad79aa15c28856e0e7e17c26bb04764c9682520f"} Apr 17 11:18:57.745866 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.745815 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"77e0f7e0cbc2a0d80d64ba0333ce2a0d9bbce65b3f0563f28551a810b4f3d544"} Apr 17 11:18:57.745866 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.745825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" event={"ID":"5466274d-9675-4358-af95-c9c0e9882159","Type":"ContainerStarted","Data":"3ce7a375e959383333350782a75d627defa28d52fc111e8a37de019b71b5de12"} Apr 17 11:18:57.746089 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.745985 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:18:57.746872 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.746848 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" event={"ID":"896800c7-73d6-4a93-95ad-0095f218b33d","Type":"ContainerStarted","Data":"e075b869a5773007cd464540861970d9f8104a9c55d284f7f5ab9ea8693a391a"} Apr 17 11:18:57.770063 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:57.770018 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" podStartSLOduration=2.116818551 podStartE2EDuration="4.770005951s" podCreationTimestamp="2026-04-17 11:18:53 +0000 UTC" firstStartedPulling="2026-04-17 11:18:54.208254578 +0000 UTC m=+163.604177663" lastFinishedPulling="2026-04-17 11:18:56.861441981 +0000 UTC m=+166.257365063" observedRunningTime="2026-04-17 11:18:57.768763948 +0000 UTC m=+167.164687052" watchObservedRunningTime="2026-04-17 11:18:57.770005951 +0000 UTC m=+167.165929055" Apr 17 11:18:59.753559 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:59.753485 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" event={"ID":"896800c7-73d6-4a93-95ad-0095f218b33d","Type":"ContainerStarted","Data":"2c39b4fe2a030e49b173ef688e13a4a451f93622e153a5d2278d684cb8990b88"} Apr 17 11:18:59.753922 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:59.753702 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:59.758151 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:59.758128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" Apr 17 11:18:59.768302 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:18:59.768260 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hp9n9" podStartSLOduration=2.576853844 podStartE2EDuration="3.768246035s" podCreationTimestamp="2026-04-17 11:18:56 +0000 UTC" firstStartedPulling="2026-04-17 11:18:57.527170795 +0000 UTC m=+166.923093877" lastFinishedPulling="2026-04-17 11:18:58.718562983 +0000 UTC m=+168.114486068" observedRunningTime="2026-04-17 11:18:59.768186049 +0000 UTC m=+169.164109154" watchObservedRunningTime="2026-04-17 11:18:59.768246035 +0000 UTC m=+169.164169140" Apr 17 11:19:00.212599 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:00.212554 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l6vms" Apr 17 11:19:00.215195 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:00.215176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j8tns\"" Apr 17 11:19:00.223010 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:00.222992 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l6vms" Apr 17 11:19:00.335900 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:00.335868 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l6vms"] Apr 17 11:19:00.338940 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:19:00.338908 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d8ae60_a18c_4042_87a0_4790a47763c3.slice/crio-07986c4ae310b67d0b457852b7ac6981ecce9c7b0294b9dd8fb60a37bb685abc WatchSource:0}: Error finding container 07986c4ae310b67d0b457852b7ac6981ecce9c7b0294b9dd8fb60a37bb685abc: Status 404 returned error can't find the container with id 07986c4ae310b67d0b457852b7ac6981ecce9c7b0294b9dd8fb60a37bb685abc Apr 17 11:19:00.757467 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:00.757434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l6vms" event={"ID":"b6d8ae60-a18c-4042-87a0-4790a47763c3","Type":"ContainerStarted","Data":"07986c4ae310b67d0b457852b7ac6981ecce9c7b0294b9dd8fb60a37bb685abc"} Apr 17 11:19:02.212595 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:02.212490 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:19:02.767585 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:02.767543 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l6vms" event={"ID":"b6d8ae60-a18c-4042-87a0-4790a47763c3","Type":"ContainerStarted","Data":"9d6173d66155cbb175ed6233c1dde00bb0aea02b4ebc93144a3c7930a5f74a95"} Apr 17 11:19:02.768772 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:02.768744 2571 generic.go:358] "Generic (PLEG): container finished" podID="08840162-a5b5-4cb0-9aef-1e6ce6c5d575" containerID="68cf99a310d94b260c1f9d0787603d524ce0cc973f601a5035b6868f9b3450b8" exitCode=1 Apr 17 11:19:02.768890 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:02.768818 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" event={"ID":"08840162-a5b5-4cb0-9aef-1e6ce6c5d575","Type":"ContainerDied","Data":"68cf99a310d94b260c1f9d0787603d524ce0cc973f601a5035b6868f9b3450b8"} Apr 17 11:19:02.769093 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:02.769075 2571 scope.go:117] "RemoveContainer" containerID="68cf99a310d94b260c1f9d0787603d524ce0cc973f601a5035b6868f9b3450b8" Apr 17 11:19:02.782386 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:02.782318 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l6vms" podStartSLOduration=138.258687717 podStartE2EDuration="2m19.782300566s" podCreationTimestamp="2026-04-17 11:16:43 +0000 UTC" firstStartedPulling="2026-04-17 11:19:00.340798173 +0000 UTC m=+169.736721256" lastFinishedPulling="2026-04-17 11:19:01.864411023 +0000 UTC m=+171.260334105" observedRunningTime="2026-04-17 11:19:02.781354203 +0000 UTC m=+172.177277344" watchObservedRunningTime="2026-04-17 11:19:02.782300566 +0000 UTC m=+172.178223685" Apr 17 11:19:03.454480 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.454448 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" Apr 17 11:19:03.468574 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.468544 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-fjq7v"] Apr 17 11:19:03.471680 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.471665 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:03.477053 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.477027 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 11:19:03.477183 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.477162 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-w8gtp\"" Apr 17 11:19:03.477346 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.477330 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 11:19:03.485583 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.485559 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-fjq7v"] Apr 17 11:19:03.583706 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.583670 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ccq9\" (UniqueName: \"kubernetes.io/projected/419b6231-5c83-4da0-981d-a59db536295f-kube-api-access-5ccq9\") pod \"downloads-6bcc868b7-fjq7v\" (UID: \"419b6231-5c83-4da0-981d-a59db536295f\") " pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:03.684459 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.684429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ccq9\" (UniqueName: \"kubernetes.io/projected/419b6231-5c83-4da0-981d-a59db536295f-kube-api-access-5ccq9\") pod \"downloads-6bcc868b7-fjq7v\" (UID: \"419b6231-5c83-4da0-981d-a59db536295f\") " pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:03.692737 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.692718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ccq9\" (UniqueName: \"kubernetes.io/projected/419b6231-5c83-4da0-981d-a59db536295f-kube-api-access-5ccq9\") pod \"downloads-6bcc868b7-fjq7v\" (UID: \"419b6231-5c83-4da0-981d-a59db536295f\") " pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:03.756261 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.756183 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6f7f856ccb-8jmgz" Apr 17 11:19:03.773238 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.773205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" event={"ID":"08840162-a5b5-4cb0-9aef-1e6ce6c5d575","Type":"ContainerStarted","Data":"15eedc8b1437165e0c753655a17282d66e674721f1ef152f08c9ad039f2a9ec3"} Apr 17 11:19:03.780094 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.780074 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:03.899114 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:03.899085 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-fjq7v"] Apr 17 11:19:03.902386 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:19:03.902349 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419b6231_5c83_4da0_981d_a59db536295f.slice/crio-0d68a124edca521d3ea033c74baf3b32b2a3cc4641f3925aa49c0a2858d1c319 WatchSource:0}: Error finding container 0d68a124edca521d3ea033c74baf3b32b2a3cc4641f3925aa49c0a2858d1c319: Status 404 returned error can't find the container with id 0d68a124edca521d3ea033c74baf3b32b2a3cc4641f3925aa49c0a2858d1c319 Apr 17 11:19:04.733869 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:04.733841 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6wgpx" Apr 17 11:19:04.778082 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:04.778044 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-fjq7v" event={"ID":"419b6231-5c83-4da0-981d-a59db536295f","Type":"ContainerStarted","Data":"0d68a124edca521d3ea033c74baf3b32b2a3cc4641f3925aa49c0a2858d1c319"} Apr 17 11:19:04.778342 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:04.778320 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" Apr 17 11:19:04.779009 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:04.778981 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f5f6644b4-8gjmp" Apr 17 11:19:20.825492 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:20.825443 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-fjq7v" event={"ID":"419b6231-5c83-4da0-981d-a59db536295f","Type":"ContainerStarted","Data":"3db4fbcf4db02a7d7af2d2f2d9ebef639e91fe075402d9a8c46a727aa83d1b56"} Apr 17 11:19:20.825866 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:20.825666 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:20.843157 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:20.843132 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-fjq7v" Apr 17 11:19:20.845051 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:20.845001 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-fjq7v" podStartSLOduration=1.928565217 podStartE2EDuration="17.844984516s" podCreationTimestamp="2026-04-17 11:19:03 +0000 UTC" firstStartedPulling="2026-04-17 11:19:03.904302445 +0000 UTC m=+173.300225539" lastFinishedPulling="2026-04-17 11:19:19.820721756 +0000 UTC m=+189.216644838" observedRunningTime="2026-04-17 11:19:20.843622443 +0000 UTC m=+190.239545548" watchObservedRunningTime="2026-04-17 11:19:20.844984516 +0000 UTC m=+190.240907619" Apr 17 11:19:57.307387 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:19:57.307308 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" podUID="fc0a0424-86a2-44a2-8d5c-34b3cd005718" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 17 11:20:07.306985 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:07.306945 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" podUID="fc0a0424-86a2-44a2-8d5c-34b3cd005718" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:20:17.307970 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.307931 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" podUID="fc0a0424-86a2-44a2-8d5c-34b3cd005718" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 11:20:17.308343 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.308012 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" Apr 17 11:20:17.308495 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.308464 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"3e5ed0e02514d7ea246d731549fb3fb83a194d793beeb340795cb060d17f6624"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 11:20:17.308538 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.308526 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" podUID="fc0a0424-86a2-44a2-8d5c-34b3cd005718" containerName="service-proxy" containerID="cri-o://3e5ed0e02514d7ea246d731549fb3fb83a194d793beeb340795cb060d17f6624" gracePeriod=30 Apr 17 11:20:17.978132 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.978097 2571 generic.go:358] "Generic (PLEG): container finished" podID="fc0a0424-86a2-44a2-8d5c-34b3cd005718" containerID="3e5ed0e02514d7ea246d731549fb3fb83a194d793beeb340795cb060d17f6624" exitCode=2 Apr 17 11:20:17.978305 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.978139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" event={"ID":"fc0a0424-86a2-44a2-8d5c-34b3cd005718","Type":"ContainerDied","Data":"3e5ed0e02514d7ea246d731549fb3fb83a194d793beeb340795cb060d17f6624"} Apr 17 11:20:17.978305 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:17.978162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-59964b5894-l9t7n" event={"ID":"fc0a0424-86a2-44a2-8d5c-34b3cd005718","Type":"ContainerStarted","Data":"7f361d6df302af6a2c345d1a363d6a8a446180ecdabe6465f774e1ba488da7b2"} Apr 17 11:20:23.022725 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:23.022687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:20:23.025089 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:23.025065 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/44159d9f-1705-4830-8bfe-c087640f29cb-metrics-certs\") pod \"network-metrics-daemon-zsnbl\" (UID: \"44159d9f-1705-4830-8bfe-c087640f29cb\") " pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:20:23.217090 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:23.217059 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-g9l5h\"" Apr 17 11:20:23.223793 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:23.223769 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsnbl" Apr 17 11:20:23.341689 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:23.341658 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zsnbl"] Apr 17 11:20:23.344848 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:20:23.344820 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44159d9f_1705_4830_8bfe_c087640f29cb.slice/crio-51cfd8d517bd86b46b2fef7c7bb1adb3b8d324f5743534c571356861599b4be9 WatchSource:0}: Error finding container 51cfd8d517bd86b46b2fef7c7bb1adb3b8d324f5743534c571356861599b4be9: Status 404 returned error can't find the container with id 51cfd8d517bd86b46b2fef7c7bb1adb3b8d324f5743534c571356861599b4be9 Apr 17 11:20:23.994674 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:23.994637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsnbl" event={"ID":"44159d9f-1705-4830-8bfe-c087640f29cb","Type":"ContainerStarted","Data":"51cfd8d517bd86b46b2fef7c7bb1adb3b8d324f5743534c571356861599b4be9"} Apr 17 11:20:24.998804 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:24.998769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsnbl" event={"ID":"44159d9f-1705-4830-8bfe-c087640f29cb","Type":"ContainerStarted","Data":"3681a72072c54d82b8f999023aaee3b1f4098722af45a20c941e4af0487337b3"} Apr 17 11:20:24.999178 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:24.998809 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsnbl" event={"ID":"44159d9f-1705-4830-8bfe-c087640f29cb","Type":"ContainerStarted","Data":"03393cc64c1b5c5100a0b30b2e71ff32f12891c743d79fa5bee0e5aa061794ff"} Apr 17 11:20:25.016659 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:20:25.016610 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zsnbl" podStartSLOduration=253.002475279 podStartE2EDuration="4m14.016596831s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:20:23.346782175 +0000 UTC m=+252.742705257" lastFinishedPulling="2026-04-17 11:20:24.360903711 +0000 UTC m=+253.756826809" observedRunningTime="2026-04-17 11:20:25.015421744 +0000 UTC m=+254.411344849" watchObservedRunningTime="2026-04-17 11:20:25.016596831 +0000 UTC m=+254.412519935" Apr 17 11:21:11.104942 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:21:11.104917 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:22:44.148021 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.147992 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6"] Apr 17 11:22:44.150917 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.150899 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.153228 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.153202 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-swh2b\"" Apr 17 11:22:44.153348 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.153260 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 11:22:44.153348 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.153326 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 11:22:44.153560 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.153347 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 11:22:44.167451 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.167424 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6"] Apr 17 11:22:44.216992 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.216962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kbk5\" (UniqueName: \"kubernetes.io/projected/82ea0b03-022f-4f8c-94a9-839bea111275-kube-api-access-7kbk5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6\" (UID: \"82ea0b03-022f-4f8c-94a9-839bea111275\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.217116 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.217037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/82ea0b03-022f-4f8c-94a9-839bea111275-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6\" (UID: \"82ea0b03-022f-4f8c-94a9-839bea111275\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.318003 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.317978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/82ea0b03-022f-4f8c-94a9-839bea111275-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6\" (UID: \"82ea0b03-022f-4f8c-94a9-839bea111275\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.318133 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.318023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kbk5\" (UniqueName: \"kubernetes.io/projected/82ea0b03-022f-4f8c-94a9-839bea111275-kube-api-access-7kbk5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6\" (UID: \"82ea0b03-022f-4f8c-94a9-839bea111275\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.320225 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.320206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/82ea0b03-022f-4f8c-94a9-839bea111275-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6\" (UID: \"82ea0b03-022f-4f8c-94a9-839bea111275\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.328061 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.328033 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kbk5\" (UniqueName: \"kubernetes.io/projected/82ea0b03-022f-4f8c-94a9-839bea111275-kube-api-access-7kbk5\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6\" (UID: \"82ea0b03-022f-4f8c-94a9-839bea111275\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.461143 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.461048 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:44.584020 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.583988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6"] Apr 17 11:22:44.587358 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:22:44.587325 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ea0b03_022f_4f8c_94a9_839bea111275.slice/crio-691648236e28c51ec70fffc87bfe598e3a408dff924dab4227229e312e985f18 WatchSource:0}: Error finding container 691648236e28c51ec70fffc87bfe598e3a408dff924dab4227229e312e985f18: Status 404 returned error can't find the container with id 691648236e28c51ec70fffc87bfe598e3a408dff924dab4227229e312e985f18 Apr 17 11:22:44.588997 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:44.588978 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:22:45.363552 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:45.363512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" event={"ID":"82ea0b03-022f-4f8c-94a9-839bea111275","Type":"ContainerStarted","Data":"691648236e28c51ec70fffc87bfe598e3a408dff924dab4227229e312e985f18"} Apr 17 11:22:49.369529 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.369502 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rp5mp"] Apr 17 11:22:49.372466 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.372447 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.374490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.374468 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 11:22:49.374613 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.374549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2d597\"" Apr 17 11:22:49.374613 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.374562 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 11:22:49.374959 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.374914 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" event={"ID":"82ea0b03-022f-4f8c-94a9-839bea111275","Type":"ContainerStarted","Data":"c21deaecaadefe479cb52a7a573a72e6b7f3ee3ce251e22029a94f5aca99b4de"} Apr 17 11:22:49.375068 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.375053 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:22:49.381292 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.381268 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rp5mp"] Apr 17 11:22:49.412823 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.412714 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" podStartSLOduration=1.152456031 podStartE2EDuration="5.412697297s" podCreationTimestamp="2026-04-17 11:22:44 +0000 UTC" firstStartedPulling="2026-04-17 11:22:44.589154908 +0000 UTC m=+393.985078002" lastFinishedPulling="2026-04-17 11:22:48.849396183 +0000 UTC m=+398.245319268" observedRunningTime="2026-04-17 11:22:49.411991116 +0000 UTC m=+398.807914219" watchObservedRunningTime="2026-04-17 11:22:49.412697297 +0000 UTC m=+398.808620402" Apr 17 11:22:49.450897 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.450864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnchg\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-kube-api-access-fnchg\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.451054 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.450903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cddf0250-1b16-44dc-95bb-fbc109cb51a2-cabundle0\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.451054 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.450989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.551684 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.551646 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fnchg\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-kube-api-access-fnchg\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.551684 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.551689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cddf0250-1b16-44dc-95bb-fbc109cb51a2-cabundle0\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.551925 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.551727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.551925 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:49.551834 2571 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 17 11:22:49.551925 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:49.551852 2571 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:22:49.551925 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:49.551861 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:22:49.551925 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:49.551875 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rp5mp: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 11:22:49.552096 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:49.551940 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates podName:cddf0250-1b16-44dc-95bb-fbc109cb51a2 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:50.051920389 +0000 UTC m=+399.447843490 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates") pod "keda-operator-ffbb595cb-rp5mp" (UID: "cddf0250-1b16-44dc-95bb-fbc109cb51a2") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 11:22:49.552278 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.552257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/cddf0250-1b16-44dc-95bb-fbc109cb51a2-cabundle0\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:49.561718 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:49.561695 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnchg\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-kube-api-access-fnchg\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:50.054768 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:50.054731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:50.054939 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:50.054843 2571 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:22:50.054939 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:50.054855 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:22:50.054939 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:50.054863 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rp5mp: references non-existent secret key: ca.crt
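All of the failures above trace back to one projected volume, "certificates", that combines material from more than one secret: the 11:22:49 attempt aborts because keda-operator-certs does not exist yet, and the 11:22:50 attempt aborts because kedaorg-certs is still missing a ca.crt key. A projected volume mounts as a unit, so a single unresolved source fails the whole MountVolume.SetUp. A sketch of a volume shape consistent with these errors; only the volume and secret names appear in the log, while the item list and paths are assumptions:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// A projected "certificates" volume consistent with the errors above: it
// cannot mount until keda-operator-certs exists and kedaorg-certs carries
// a ca.crt key.
var certificatesVolume = corev1.Volume{
	Name: "certificates",
	VolumeSource: corev1.VolumeSource{
		Projected: &corev1.ProjectedVolumeSource{
			Sources: []corev1.VolumeProjection{
				{Secret: &corev1.SecretProjection{
					LocalObjectReference: corev1.LocalObjectReference{Name: "keda-operator-certs"},
				}},
				{Secret: &corev1.SecretProjection{
					LocalObjectReference: corev1.LocalObjectReference{Name: "kedaorg-certs"},
					// The key the kubelet reports as missing at 11:22:50.
					Items: []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
				}},
			},
		},
	},
}

func main() { fmt.Println(certificatesVolume.Name) }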
Apr 17 11:22:50.054939 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:50.054910 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates podName:cddf0250-1b16-44dc-95bb-fbc109cb51a2 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:51.054896888 +0000 UTC m=+400.450819969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates") pod "keda-operator-ffbb595cb-rp5mp" (UID: "cddf0250-1b16-44dc-95bb-fbc109cb51a2") : references non-existent secret key: ca.crt Apr 17 11:22:51.060433 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:51.060400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:51.060793 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:51.060540 2571 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:22:51.060793 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:51.060558 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:22:51.060793 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:51.060568 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-rp5mp: references non-existent secret key: ca.crt Apr 17 11:22:51.060793 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:22:51.060623 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates podName:cddf0250-1b16-44dc-95bb-fbc109cb51a2 nodeName:}" failed. No retries permitted until 2026-04-17 11:22:53.060608511 +0000 UTC m=+402.456531593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates") pod "keda-operator-ffbb595cb-rp5mp" (UID: "cddf0250-1b16-44dc-95bb-fbc109cb51a2") : references non-existent secret key: ca.crt Apr 17 11:22:53.072916 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:53.072877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:53.075351 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:53.075327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/cddf0250-1b16-44dc-95bb-fbc109cb51a2-certificates\") pod \"keda-operator-ffbb595cb-rp5mp\" (UID: \"cddf0250-1b16-44dc-95bb-fbc109cb51a2\") " pod="openshift-keda/keda-operator-ffbb595cb-rp5mp"
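Note the retry spacing across the three failed attempts: durationBeforeRetry grows 500ms, 1s, 2s, i.e. the volume manager doubles the delay after each consecutive failure of the same operation and drops the backoff once an attempt succeeds, as the 11:22:53 mount finally does. A minimal sketch of that policy (not the kubelet's actual implementation; the cap is an assumption about kubelet defaults and is never reached in this log):

package main

import (
	"fmt"
	"time"
)

// Doubling backoff as observed above: 500ms -> 1s -> 2s -> ...
func durationBeforeRetry(consecutiveFailures int) time.Duration {
	const initial = 500 * time.Millisecond
	const maxDelay = 2*time.Minute + 2*time.Second // assumed cap, not hit in this log
	d := initial
	for i := 1; i < consecutiveFailures; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 4; n++ {
		fmt.Printf("failure %d -> retry in %v\n", n, durationBeforeRetry(n)) // 500ms, 1s, 2s, 4s
	}
}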
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:53.402393 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:53.402335 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-rp5mp"] Apr 17 11:22:53.407188 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:22:53.407158 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcddf0250_1b16_44dc_95bb_fbc109cb51a2.slice/crio-ebee28613d71ccc0f741b199925e38305a18339f62a2665ca255351c45a4d25e WatchSource:0}: Error finding container ebee28613d71ccc0f741b199925e38305a18339f62a2665ca255351c45a4d25e: Status 404 returned error can't find the container with id ebee28613d71ccc0f741b199925e38305a18339f62a2665ca255351c45a4d25e Apr 17 11:22:54.387565 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:54.387532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" event={"ID":"cddf0250-1b16-44dc-95bb-fbc109cb51a2","Type":"ContainerStarted","Data":"ebee28613d71ccc0f741b199925e38305a18339f62a2665ca255351c45a4d25e"} Apr 17 11:22:58.400006 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:58.399976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" event={"ID":"cddf0250-1b16-44dc-95bb-fbc109cb51a2","Type":"ContainerStarted","Data":"b85c2093124d6def70903582d36ca4ea9a0fcbc51f7c177b6920098e97c0d6e4"} Apr 17 11:22:58.400357 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:58.400137 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:22:58.418344 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:22:58.418302 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" podStartSLOduration=4.808528307 podStartE2EDuration="9.418287123s" podCreationTimestamp="2026-04-17 11:22:49 +0000 UTC" firstStartedPulling="2026-04-17 11:22:53.408433537 +0000 UTC m=+402.804356622" lastFinishedPulling="2026-04-17 11:22:58.018192355 +0000 UTC m=+407.414115438" observedRunningTime="2026-04-17 11:22:58.417030462 +0000 UTC m=+407.812953591" watchObservedRunningTime="2026-04-17 11:22:58.418287123 +0000 UTC m=+407.814210227" Apr 17 11:23:10.379312 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:10.379279 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ctmz6" Apr 17 11:23:19.405008 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:19.404970 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-rp5mp" Apr 17 11:23:55.368222 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.368186 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54"] Apr 17 11:23:55.371321 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.371302 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.379942 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.379925 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 11:23:55.380574 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.380559 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:23:55.380617 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.380563 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-vx9pw\"" Apr 17 11:23:55.397443 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.397423 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54"] Apr 17 11:23:55.531231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.531192 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkhk\" (UniqueName: \"kubernetes.io/projected/e9d6003a-a895-40c7-bc89-70c94ba78222-kube-api-access-vgkhk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mfz54\" (UID: \"e9d6003a-a895-40c7-bc89-70c94ba78222\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.531424 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.531242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9d6003a-a895-40c7-bc89-70c94ba78222-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mfz54\" (UID: \"e9d6003a-a895-40c7-bc89-70c94ba78222\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.632511 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.632425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkhk\" (UniqueName: \"kubernetes.io/projected/e9d6003a-a895-40c7-bc89-70c94ba78222-kube-api-access-vgkhk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mfz54\" (UID: \"e9d6003a-a895-40c7-bc89-70c94ba78222\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.632511 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.632475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9d6003a-a895-40c7-bc89-70c94ba78222-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mfz54\" (UID: \"e9d6003a-a895-40c7-bc89-70c94ba78222\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.632788 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.632773 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9d6003a-a895-40c7-bc89-70c94ba78222-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mfz54\" (UID: \"e9d6003a-a895-40c7-bc89-70c94ba78222\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.655353 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.655326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vgkhk\" (UniqueName: \"kubernetes.io/projected/e9d6003a-a895-40c7-bc89-70c94ba78222-kube-api-access-vgkhk\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-mfz54\" (UID: \"e9d6003a-a895-40c7-bc89-70c94ba78222\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.680249 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.680224 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" Apr 17 11:23:55.828121 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:55.828085 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54"] Apr 17 11:23:55.834538 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:23:55.834507 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d6003a_a895_40c7_bc89_70c94ba78222.slice/crio-a2c157036b72d472f302e14bfcfb4361e8fd619ecdc556c2466b519c91e120e5 WatchSource:0}: Error finding container a2c157036b72d472f302e14bfcfb4361e8fd619ecdc556c2466b519c91e120e5: Status 404 returned error can't find the container with id a2c157036b72d472f302e14bfcfb4361e8fd619ecdc556c2466b519c91e120e5 Apr 17 11:23:56.557901 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:56.557864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" event={"ID":"e9d6003a-a895-40c7-bc89-70c94ba78222","Type":"ContainerStarted","Data":"a2c157036b72d472f302e14bfcfb4361e8fd619ecdc556c2466b519c91e120e5"} Apr 17 11:23:57.563052 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:57.563017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" event={"ID":"e9d6003a-a895-40c7-bc89-70c94ba78222","Type":"ContainerStarted","Data":"590123e627d6cb1408c95c05d0b69bed49cf628a526610bd411cf47e2a570a85"} Apr 17 11:23:57.588661 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:23:57.588616 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-mfz54" podStartSLOduration=1.100761844 podStartE2EDuration="2.588601052s" podCreationTimestamp="2026-04-17 11:23:55 +0000 UTC" firstStartedPulling="2026-04-17 11:23:55.837151433 +0000 UTC m=+465.233074518" lastFinishedPulling="2026-04-17 11:23:57.32499064 +0000 UTC m=+466.720913726" observedRunningTime="2026-04-17 11:23:57.587517591 +0000 UTC m=+466.983440705" watchObservedRunningTime="2026-04-17 11:23:57.588601052 +0000 UTC m=+466.984524155" Apr 17 11:24:11.845102 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.845071 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq"] Apr 17 11:24:11.848171 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.848155 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:11.850076 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.850055 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 11:24:11.850264 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.850252 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:24:11.850674 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.850659 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-j7tnq\"" Apr 17 11:24:11.856011 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.855973 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq"] Apr 17 11:24:11.961666 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.961630 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5118fa57-5af0-4e64-9ae9-ac54ed661c3e-tmp\") pod \"openshift-lws-operator-bfc7f696d-656mq\" (UID: \"5118fa57-5af0-4e64-9ae9-ac54ed661c3e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:11.961822 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:11.961720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwcr\" (UniqueName: \"kubernetes.io/projected/5118fa57-5af0-4e64-9ae9-ac54ed661c3e-kube-api-access-vrwcr\") pod \"openshift-lws-operator-bfc7f696d-656mq\" (UID: \"5118fa57-5af0-4e64-9ae9-ac54ed661c3e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:12.062525 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.062492 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwcr\" (UniqueName: \"kubernetes.io/projected/5118fa57-5af0-4e64-9ae9-ac54ed661c3e-kube-api-access-vrwcr\") pod \"openshift-lws-operator-bfc7f696d-656mq\" (UID: \"5118fa57-5af0-4e64-9ae9-ac54ed661c3e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:12.062694 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.062556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5118fa57-5af0-4e64-9ae9-ac54ed661c3e-tmp\") pod \"openshift-lws-operator-bfc7f696d-656mq\" (UID: \"5118fa57-5af0-4e64-9ae9-ac54ed661c3e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:12.062938 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.062918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5118fa57-5af0-4e64-9ae9-ac54ed661c3e-tmp\") pod \"openshift-lws-operator-bfc7f696d-656mq\" (UID: \"5118fa57-5af0-4e64-9ae9-ac54ed661c3e\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:12.073669 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.073645 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwcr\" (UniqueName: \"kubernetes.io/projected/5118fa57-5af0-4e64-9ae9-ac54ed661c3e-kube-api-access-vrwcr\") pod \"openshift-lws-operator-bfc7f696d-656mq\" (UID: \"5118fa57-5af0-4e64-9ae9-ac54ed661c3e\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:12.158073 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.157984 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" Apr 17 11:24:12.280872 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.280841 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq"] Apr 17 11:24:12.283956 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:24:12.283927 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5118fa57_5af0_4e64_9ae9_ac54ed661c3e.slice/crio-865ecb56d4506179000a892c663656f8bee64309c9b77874c0d770032bae2dfc WatchSource:0}: Error finding container 865ecb56d4506179000a892c663656f8bee64309c9b77874c0d770032bae2dfc: Status 404 returned error can't find the container with id 865ecb56d4506179000a892c663656f8bee64309c9b77874c0d770032bae2dfc Apr 17 11:24:12.612082 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:12.612051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" event={"ID":"5118fa57-5af0-4e64-9ae9-ac54ed661c3e","Type":"ContainerStarted","Data":"865ecb56d4506179000a892c663656f8bee64309c9b77874c0d770032bae2dfc"} Apr 17 11:24:15.622619 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:15.622540 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" event={"ID":"5118fa57-5af0-4e64-9ae9-ac54ed661c3e","Type":"ContainerStarted","Data":"f3006c869a794a31814f61afff7643f2eaff3bff7f33cc7db9d48fcc589c9adf"} Apr 17 11:24:15.638037 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:15.637989 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-656mq" podStartSLOduration=1.572679736 podStartE2EDuration="4.637972189s" podCreationTimestamp="2026-04-17 11:24:11 +0000 UTC" firstStartedPulling="2026-04-17 11:24:12.285528806 +0000 UTC m=+481.681451891" lastFinishedPulling="2026-04-17 11:24:15.350821251 +0000 UTC m=+484.746744344" observedRunningTime="2026-04-17 11:24:15.636961914 +0000 UTC m=+485.032885017" watchObservedRunningTime="2026-04-17 11:24:15.637972189 +0000 UTC m=+485.033895296" Apr 17 11:24:42.527700 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.527659 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b679b657-dz99n"] Apr 17 11:24:42.531275 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.531247 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.533951 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.533930 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 11:24:42.533951 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.533941 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-l5tjv\"" Apr 17 11:24:42.533951 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.533930 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 11:24:42.534151 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.533936 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 11:24:42.542750 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.542719 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b679b657-dz99n"] Apr 17 11:24:42.590829 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.590798 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/788421f1-977c-4d5f-9180-d51eec8be1ce-manager-config\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.590962 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.590837 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/788421f1-977c-4d5f-9180-d51eec8be1ce-metrics-cert\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.590962 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.590888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsd97\" (UniqueName: \"kubernetes.io/projected/788421f1-977c-4d5f-9180-d51eec8be1ce-kube-api-access-zsd97\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.590962 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.590958 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/788421f1-977c-4d5f-9180-d51eec8be1ce-cert\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.691754 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.691716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/788421f1-977c-4d5f-9180-d51eec8be1ce-cert\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.691908 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.691779 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/788421f1-977c-4d5f-9180-d51eec8be1ce-manager-config\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.691908 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.691809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/788421f1-977c-4d5f-9180-d51eec8be1ce-metrics-cert\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.691908 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.691838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsd97\" (UniqueName: \"kubernetes.io/projected/788421f1-977c-4d5f-9180-d51eec8be1ce-kube-api-access-zsd97\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.692526 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.692508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/788421f1-977c-4d5f-9180-d51eec8be1ce-manager-config\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.694218 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.694195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/788421f1-977c-4d5f-9180-d51eec8be1ce-cert\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.694305 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.694246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/788421f1-977c-4d5f-9180-d51eec8be1ce-metrics-cert\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.712761 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.712736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsd97\" (UniqueName: \"kubernetes.io/projected/788421f1-977c-4d5f-9180-d51eec8be1ce-kube-api-access-zsd97\") pod \"lws-controller-manager-56b679b657-dz99n\" (UID: \"788421f1-977c-4d5f-9180-d51eec8be1ce\") " pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.840415 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.840355 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:42.968868 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:42.968821 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56b679b657-dz99n"] Apr 17 11:24:42.971856 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:24:42.971830 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod788421f1_977c_4d5f_9180_d51eec8be1ce.slice/crio-731ca2af9520e3bacfa03c48b26c5637f127e25dfc4700e3ec5994b118e02e78 WatchSource:0}: Error finding container 731ca2af9520e3bacfa03c48b26c5637f127e25dfc4700e3ec5994b118e02e78: Status 404 returned error can't find the container with id 731ca2af9520e3bacfa03c48b26c5637f127e25dfc4700e3ec5994b118e02e78 Apr 17 11:24:43.716961 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:43.716878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" event={"ID":"788421f1-977c-4d5f-9180-d51eec8be1ce","Type":"ContainerStarted","Data":"731ca2af9520e3bacfa03c48b26c5637f127e25dfc4700e3ec5994b118e02e78"} Apr 17 11:24:45.723353 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:45.723319 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" event={"ID":"788421f1-977c-4d5f-9180-d51eec8be1ce","Type":"ContainerStarted","Data":"730b01d398466a5c94f83ff1dbed3036646ac85caa28b9233ff3324a2844a625"} Apr 17 11:24:45.723713 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:45.723490 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:24:45.742051 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:45.742003 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" podStartSLOduration=1.9785214 podStartE2EDuration="3.74198848s" podCreationTimestamp="2026-04-17 11:24:42 +0000 UTC" firstStartedPulling="2026-04-17 11:24:42.973669676 +0000 UTC m=+512.369592758" lastFinishedPulling="2026-04-17 11:24:44.737136756 +0000 UTC m=+514.133059838" observedRunningTime="2026-04-17 11:24:45.74106482 +0000 UTC m=+515.136987937" watchObservedRunningTime="2026-04-17 11:24:45.74198848 +0000 UTC m=+515.137911585" Apr 17 11:24:56.728954 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:24:56.728923 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-56b679b657-dz99n" Apr 17 11:25:36.349259 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.349225 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m"] Apr 17 11:25:36.352459 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.352443 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.355248 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.355222 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 11:25:36.355366 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.355350 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 11:25:36.355714 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.355698 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-lcs5v\"" Apr 17 11:25:36.355771 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.355733 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 11:25:36.355771 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.355736 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 11:25:36.360412 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.360388 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m"] Apr 17 11:25:36.410906 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.410873 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e02ea2b0-952d-405f-b158-c39fa002b78c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.411061 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.410920 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhr8m\" (UniqueName: \"kubernetes.io/projected/e02ea2b0-952d-405f-b158-c39fa002b78c-kube-api-access-bhr8m\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.411061 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.410991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ea2b0-952d-405f-b158-c39fa002b78c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.511571 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.511543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhr8m\" (UniqueName: \"kubernetes.io/projected/e02ea2b0-952d-405f-b158-c39fa002b78c-kube-api-access-bhr8m\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.511713 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.511586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ea2b0-952d-405f-b158-c39fa002b78c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.511713 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.511622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e02ea2b0-952d-405f-b158-c39fa002b78c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.511781 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:25:36.511746 2571 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 11:25:36.511842 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:25:36.511830 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e02ea2b0-952d-405f-b158-c39fa002b78c-plugin-serving-cert podName:e02ea2b0-952d-405f-b158-c39fa002b78c nodeName:}" failed. No retries permitted until 2026-04-17 11:25:37.011809004 +0000 UTC m=+566.407732086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e02ea2b0-952d-405f-b158-c39fa002b78c-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-ghw7m" (UID: "e02ea2b0-952d-405f-b158-c39fa002b78c") : secret "plugin-serving-cert" not found Apr 17 11:25:36.512194 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.512178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e02ea2b0-952d-405f-b158-c39fa002b78c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:36.521393 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:36.521353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhr8m\" (UniqueName: \"kubernetes.io/projected/e02ea2b0-952d-405f-b158-c39fa002b78c-kube-api-access-bhr8m\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:37.015821 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:37.015784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ea2b0-952d-405f-b158-c39fa002b78c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:37.018176 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:37.018151 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e02ea2b0-952d-405f-b158-c39fa002b78c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ghw7m\" (UID: \"e02ea2b0-952d-405f-b158-c39fa002b78c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:37.261874 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:37.261840 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" Apr 17 11:25:37.385073 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:37.385049 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m"] Apr 17 11:25:37.387938 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:25:37.387906 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode02ea2b0_952d_405f_b158_c39fa002b78c.slice/crio-859eeb005a9b5ad322508f169a69719e9a7e6b1acd090a3bf0ff197e5e2e2cac WatchSource:0}: Error finding container 859eeb005a9b5ad322508f169a69719e9a7e6b1acd090a3bf0ff197e5e2e2cac: Status 404 returned error can't find the container with id 859eeb005a9b5ad322508f169a69719e9a7e6b1acd090a3bf0ff197e5e2e2cac Apr 17 11:25:37.884700 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:37.884670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" event={"ID":"e02ea2b0-952d-405f-b158-c39fa002b78c","Type":"ContainerStarted","Data":"859eeb005a9b5ad322508f169a69719e9a7e6b1acd090a3bf0ff197e5e2e2cac"} Apr 17 11:25:42.901622 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:42.901583 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" event={"ID":"e02ea2b0-952d-405f-b158-c39fa002b78c","Type":"ContainerStarted","Data":"6958af6841da11ee4faf7f627f6a0425cd12dbe7012f30e8d0c4bc2f2e7c62ad"} Apr 17 11:25:42.917638 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:25:42.917586 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ghw7m" podStartSLOduration=2.384244604 podStartE2EDuration="6.917573s" podCreationTimestamp="2026-04-17 11:25:36 +0000 UTC" firstStartedPulling="2026-04-17 11:25:37.389789733 +0000 UTC m=+566.785712830" lastFinishedPulling="2026-04-17 11:25:41.923118143 +0000 UTC m=+571.319041226" observedRunningTime="2026-04-17 11:25:42.916667077 +0000 UTC m=+572.312590183" watchObservedRunningTime="2026-04-17 11:25:42.917573 +0000 UTC m=+572.313496104" Apr 17 11:26:19.407321 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.407291 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-z2pcd"] Apr 17 11:26:19.430985 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.430956 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-z2pcd"] Apr 17 11:26:19.431136 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.431065 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:19.433123 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.433098 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-vpt8q\"" Apr 17 11:26:19.559116 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.559085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66xb\" (UniqueName: \"kubernetes.io/projected/f50efeaf-a4db-441a-ae34-53d10a99d82d-kube-api-access-x66xb\") pod \"authorino-674b59b84c-z2pcd\" (UID: \"f50efeaf-a4db-441a-ae34-53d10a99d82d\") " pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:19.603941 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.603907 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-mbwq2"] Apr 17 11:26:19.606795 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.606773 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:19.614976 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.614954 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-mbwq2"] Apr 17 11:26:19.660538 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.660465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x66xb\" (UniqueName: \"kubernetes.io/projected/f50efeaf-a4db-441a-ae34-53d10a99d82d-kube-api-access-x66xb\") pod \"authorino-674b59b84c-z2pcd\" (UID: \"f50efeaf-a4db-441a-ae34-53d10a99d82d\") " pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:19.667802 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.667782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66xb\" (UniqueName: \"kubernetes.io/projected/f50efeaf-a4db-441a-ae34-53d10a99d82d-kube-api-access-x66xb\") pod \"authorino-674b59b84c-z2pcd\" (UID: \"f50efeaf-a4db-441a-ae34-53d10a99d82d\") " pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:19.740085 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.740061 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:19.761872 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.761847 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnsr\" (UniqueName: \"kubernetes.io/projected/48c782c4-2e07-4563-8bde-b04fdf45fec1-kube-api-access-nxnsr\") pod \"authorino-79cbc94b89-mbwq2\" (UID: \"48c782c4-2e07-4563-8bde-b04fdf45fec1\") " pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:19.852238 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.852194 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-z2pcd"] Apr 17 11:26:19.855105 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:26:19.855071 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf50efeaf_a4db_441a_ae34_53d10a99d82d.slice/crio-5da838b823501d43669345a62450f47634d026e9b5184a48272d0edeb6602aca WatchSource:0}: Error finding container 5da838b823501d43669345a62450f47634d026e9b5184a48272d0edeb6602aca: Status 404 returned error can't find the container with id 5da838b823501d43669345a62450f47634d026e9b5184a48272d0edeb6602aca Apr 17 11:26:19.862194 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.862169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxnsr\" (UniqueName: \"kubernetes.io/projected/48c782c4-2e07-4563-8bde-b04fdf45fec1-kube-api-access-nxnsr\") pod \"authorino-79cbc94b89-mbwq2\" (UID: \"48c782c4-2e07-4563-8bde-b04fdf45fec1\") " pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:19.869851 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.869831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxnsr\" (UniqueName: \"kubernetes.io/projected/48c782c4-2e07-4563-8bde-b04fdf45fec1-kube-api-access-nxnsr\") pod \"authorino-79cbc94b89-mbwq2\" (UID: \"48c782c4-2e07-4563-8bde-b04fdf45fec1\") " pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:19.916358 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:19.916299 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:20.012917 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:20.012873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-z2pcd" event={"ID":"f50efeaf-a4db-441a-ae34-53d10a99d82d","Type":"ContainerStarted","Data":"5da838b823501d43669345a62450f47634d026e9b5184a48272d0edeb6602aca"} Apr 17 11:26:20.031331 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:20.031303 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-mbwq2"] Apr 17 11:26:20.034139 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:26:20.034112 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c782c4_2e07_4563_8bde_b04fdf45fec1.slice/crio-2f8570d4b7ec92d32c148daa1b0417636e7eb2df5e9df2551523e0bfb1e5385a WatchSource:0}: Error finding container 2f8570d4b7ec92d32c148daa1b0417636e7eb2df5e9df2551523e0bfb1e5385a: Status 404 returned error can't find the container with id 2f8570d4b7ec92d32c148daa1b0417636e7eb2df5e9df2551523e0bfb1e5385a Apr 17 11:26:21.018999 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:21.018942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" event={"ID":"48c782c4-2e07-4563-8bde-b04fdf45fec1","Type":"ContainerStarted","Data":"2f8570d4b7ec92d32c148daa1b0417636e7eb2df5e9df2551523e0bfb1e5385a"} Apr 17 11:26:23.029811 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:23.029772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" event={"ID":"48c782c4-2e07-4563-8bde-b04fdf45fec1","Type":"ContainerStarted","Data":"3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1"} Apr 17 11:26:23.031081 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:23.031058 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-z2pcd" event={"ID":"f50efeaf-a4db-441a-ae34-53d10a99d82d","Type":"ContainerStarted","Data":"69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e"} Apr 17 11:26:23.044528 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:23.044485 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" podStartSLOduration=1.542417004 podStartE2EDuration="4.04447175s" podCreationTimestamp="2026-04-17 11:26:19 +0000 UTC" firstStartedPulling="2026-04-17 11:26:20.03556901 +0000 UTC m=+609.431492092" lastFinishedPulling="2026-04-17 11:26:22.537623757 +0000 UTC m=+611.933546838" observedRunningTime="2026-04-17 11:26:23.042703941 +0000 UTC m=+612.438627050" watchObservedRunningTime="2026-04-17 11:26:23.04447175 +0000 UTC m=+612.440394853" Apr 17 11:26:23.058017 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:23.057977 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-z2pcd" podStartSLOduration=1.366742442 podStartE2EDuration="4.057964273s" podCreationTimestamp="2026-04-17 11:26:19 +0000 UTC" firstStartedPulling="2026-04-17 11:26:19.856426261 +0000 UTC m=+609.252349343" lastFinishedPulling="2026-04-17 11:26:22.547648093 +0000 UTC m=+611.943571174" observedRunningTime="2026-04-17 11:26:23.056676888 +0000 UTC m=+612.452600015" watchObservedRunningTime="2026-04-17 11:26:23.057964273 +0000 UTC m=+612.453887375" Apr 17 11:26:23.086767 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:23.086738 2571 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-z2pcd"] Apr 17 11:26:25.038206 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:25.038162 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-z2pcd" podUID="f50efeaf-a4db-441a-ae34-53d10a99d82d" containerName="authorino" containerID="cri-o://69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e" gracePeriod=30 Apr 17 11:26:25.284784 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:25.284757 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:25.410897 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:25.410859 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66xb\" (UniqueName: \"kubernetes.io/projected/f50efeaf-a4db-441a-ae34-53d10a99d82d-kube-api-access-x66xb\") pod \"f50efeaf-a4db-441a-ae34-53d10a99d82d\" (UID: \"f50efeaf-a4db-441a-ae34-53d10a99d82d\") " Apr 17 11:26:25.413084 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:25.413058 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f50efeaf-a4db-441a-ae34-53d10a99d82d-kube-api-access-x66xb" (OuterVolumeSpecName: "kube-api-access-x66xb") pod "f50efeaf-a4db-441a-ae34-53d10a99d82d" (UID: "f50efeaf-a4db-441a-ae34-53d10a99d82d"). InnerVolumeSpecName "kube-api-access-x66xb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:25.512418 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:25.512355 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x66xb\" (UniqueName: \"kubernetes.io/projected/f50efeaf-a4db-441a-ae34-53d10a99d82d-kube-api-access-x66xb\") on node \"ip-10-0-129-94.ec2.internal\" DevicePath \"\"" Apr 17 11:26:26.042205 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.042170 2571 generic.go:358] "Generic (PLEG): container finished" podID="f50efeaf-a4db-441a-ae34-53d10a99d82d" containerID="69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e" exitCode=0 Apr 17 11:26:26.042636 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.042223 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-z2pcd" Apr 17 11:26:26.042636 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.042257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-z2pcd" event={"ID":"f50efeaf-a4db-441a-ae34-53d10a99d82d","Type":"ContainerDied","Data":"69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e"} Apr 17 11:26:26.042636 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.042301 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-z2pcd" event={"ID":"f50efeaf-a4db-441a-ae34-53d10a99d82d","Type":"ContainerDied","Data":"5da838b823501d43669345a62450f47634d026e9b5184a48272d0edeb6602aca"} Apr 17 11:26:26.042636 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.042319 2571 scope.go:117] "RemoveContainer" containerID="69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e" Apr 17 11:26:26.050606 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.050588 2571 scope.go:117] "RemoveContainer" containerID="69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e" Apr 17 11:26:26.050858 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:26:26.050839 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e\": container with ID starting with 69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e not found: ID does not exist" containerID="69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e" Apr 17 11:26:26.050921 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.050872 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e"} err="failed to get container status \"69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e\": rpc error: code = NotFound desc = could not find container \"69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e\": container with ID starting with 69daff4cedfec13c64e0f38efdd32b4c0cdd2a3c81e8cb3a2701d1acf884b34e not found: ID does not exist" Apr 17 11:26:26.060852 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.060826 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-z2pcd"] Apr 17 11:26:26.064511 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:26.064490 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-z2pcd"] Apr 17 11:26:27.216408 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:27.216352 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f50efeaf-a4db-441a-ae34-53d10a99d82d" path="/var/lib/kubelet/pods/f50efeaf-a4db-441a-ae34-53d10a99d82d/volumes" Apr 17 11:26:43.872978 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.872939 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-g4tfv"] Apr 17 11:26:43.873342 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.873213 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f50efeaf-a4db-441a-ae34-53d10a99d82d" containerName="authorino" Apr 17 11:26:43.873342 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.873224 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f50efeaf-a4db-441a-ae34-53d10a99d82d" containerName="authorino" Apr 17 11:26:43.873342 ip-10-0-129-94 kubenswrapper[2571]: I0417 
11:26:43.873268 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f50efeaf-a4db-441a-ae34-53d10a99d82d" containerName="authorino" Apr 17 11:26:43.876058 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.876042 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:43.878098 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.878080 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 11:26:43.883220 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.883130 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-g4tfv"] Apr 17 11:26:43.939718 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.939684 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9eab7d8c-c003-4085-be22-cd64e2cf3af0-tls-cert\") pod \"authorino-68bd676465-g4tfv\" (UID: \"9eab7d8c-c003-4085-be22-cd64e2cf3af0\") " pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:43.939874 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:43.939755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qp24\" (UniqueName: \"kubernetes.io/projected/9eab7d8c-c003-4085-be22-cd64e2cf3af0-kube-api-access-5qp24\") pod \"authorino-68bd676465-g4tfv\" (UID: \"9eab7d8c-c003-4085-be22-cd64e2cf3af0\") " pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:44.040672 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:44.040628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qp24\" (UniqueName: \"kubernetes.io/projected/9eab7d8c-c003-4085-be22-cd64e2cf3af0-kube-api-access-5qp24\") pod \"authorino-68bd676465-g4tfv\" (UID: \"9eab7d8c-c003-4085-be22-cd64e2cf3af0\") " pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:44.040819 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:44.040695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9eab7d8c-c003-4085-be22-cd64e2cf3af0-tls-cert\") pod \"authorino-68bd676465-g4tfv\" (UID: \"9eab7d8c-c003-4085-be22-cd64e2cf3af0\") " pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:44.043088 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:44.043068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9eab7d8c-c003-4085-be22-cd64e2cf3af0-tls-cert\") pod \"authorino-68bd676465-g4tfv\" (UID: \"9eab7d8c-c003-4085-be22-cd64e2cf3af0\") " pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:44.049198 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:44.049179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qp24\" (UniqueName: \"kubernetes.io/projected/9eab7d8c-c003-4085-be22-cd64e2cf3af0-kube-api-access-5qp24\") pod \"authorino-68bd676465-g4tfv\" (UID: \"9eab7d8c-c003-4085-be22-cd64e2cf3af0\") " pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:44.185921 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:44.185832 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-g4tfv" Apr 17 11:26:44.307399 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:44.307273 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-g4tfv"] Apr 17 11:26:44.309986 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:26:44.309957 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eab7d8c_c003_4085_be22_cd64e2cf3af0.slice/crio-d77f7274c7b7f2a69e6dbd1a75bd3c9fed027fc34c390028516bff0d07950ae2 WatchSource:0}: Error finding container d77f7274c7b7f2a69e6dbd1a75bd3c9fed027fc34c390028516bff0d07950ae2: Status 404 returned error can't find the container with id d77f7274c7b7f2a69e6dbd1a75bd3c9fed027fc34c390028516bff0d07950ae2 Apr 17 11:26:45.105278 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.105192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-g4tfv" event={"ID":"9eab7d8c-c003-4085-be22-cd64e2cf3af0","Type":"ContainerStarted","Data":"789160389d027d1c565bbb988ca7e37c6886c9c81ea695087924a89054cd75c2"} Apr 17 11:26:45.105278 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.105228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-g4tfv" event={"ID":"9eab7d8c-c003-4085-be22-cd64e2cf3af0","Type":"ContainerStarted","Data":"d77f7274c7b7f2a69e6dbd1a75bd3c9fed027fc34c390028516bff0d07950ae2"} Apr 17 11:26:45.120314 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.120268 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-g4tfv" podStartSLOduration=1.596666435 podStartE2EDuration="2.120250332s" podCreationTimestamp="2026-04-17 11:26:43 +0000 UTC" firstStartedPulling="2026-04-17 11:26:44.311352069 +0000 UTC m=+633.707275153" lastFinishedPulling="2026-04-17 11:26:44.834935969 +0000 UTC m=+634.230859050" observedRunningTime="2026-04-17 11:26:45.118847266 +0000 UTC m=+634.514770371" watchObservedRunningTime="2026-04-17 11:26:45.120250332 +0000 UTC m=+634.516173434" Apr 17 11:26:45.145308 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.145278 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-mbwq2"] Apr 17 11:26:45.145497 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.145478 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" podUID="48c782c4-2e07-4563-8bde-b04fdf45fec1" containerName="authorino" containerID="cri-o://3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1" gracePeriod=30 Apr 17 11:26:45.387072 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.387039 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:45.449231 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.449200 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxnsr\" (UniqueName: \"kubernetes.io/projected/48c782c4-2e07-4563-8bde-b04fdf45fec1-kube-api-access-nxnsr\") pod \"48c782c4-2e07-4563-8bde-b04fdf45fec1\" (UID: \"48c782c4-2e07-4563-8bde-b04fdf45fec1\") " Apr 17 11:26:45.451203 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.451169 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c782c4-2e07-4563-8bde-b04fdf45fec1-kube-api-access-nxnsr" (OuterVolumeSpecName: "kube-api-access-nxnsr") pod "48c782c4-2e07-4563-8bde-b04fdf45fec1" (UID: "48c782c4-2e07-4563-8bde-b04fdf45fec1"). InnerVolumeSpecName "kube-api-access-nxnsr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:26:45.549900 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:45.549869 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxnsr\" (UniqueName: \"kubernetes.io/projected/48c782c4-2e07-4563-8bde-b04fdf45fec1-kube-api-access-nxnsr\") on node \"ip-10-0-129-94.ec2.internal\" DevicePath \"\"" Apr 17 11:26:46.110655 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.110618 2571 generic.go:358] "Generic (PLEG): container finished" podID="48c782c4-2e07-4563-8bde-b04fdf45fec1" containerID="3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1" exitCode=0 Apr 17 11:26:46.111055 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.110668 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" Apr 17 11:26:46.111055 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.110703 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" event={"ID":"48c782c4-2e07-4563-8bde-b04fdf45fec1","Type":"ContainerDied","Data":"3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1"} Apr 17 11:26:46.111055 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.110740 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-mbwq2" event={"ID":"48c782c4-2e07-4563-8bde-b04fdf45fec1","Type":"ContainerDied","Data":"2f8570d4b7ec92d32c148daa1b0417636e7eb2df5e9df2551523e0bfb1e5385a"} Apr 17 11:26:46.111055 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.110762 2571 scope.go:117] "RemoveContainer" containerID="3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1" Apr 17 11:26:46.121219 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.121201 2571 scope.go:117] "RemoveContainer" containerID="3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1" Apr 17 11:26:46.121501 ip-10-0-129-94 kubenswrapper[2571]: E0417 11:26:46.121480 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1\": container with ID starting with 3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1 not found: ID does not exist" containerID="3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1" Apr 17 11:26:46.121551 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.121511 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1"} 
err="failed to get container status \"3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1\": rpc error: code = NotFound desc = could not find container \"3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1\": container with ID starting with 3fdad632bfc5af98a5dc7aed20d25382e9c316d7e8e8a96b749f142775644ea1 not found: ID does not exist" Apr 17 11:26:46.130849 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.130823 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-mbwq2"] Apr 17 11:26:46.134505 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:46.134484 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-mbwq2"] Apr 17 11:26:47.215906 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:26:47.215872 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48c782c4-2e07-4563-8bde-b04fdf45fec1" path="/var/lib/kubelet/pods/48c782c4-2e07-4563-8bde-b04fdf45fec1/volumes" Apr 17 11:29:08.293273 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:08.293244 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-g4tfv_9eab7d8c-c003-4085-be22-cd64e2cf3af0/authorino/0.log" Apr 17 11:29:08.348927 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:08.348901 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ghw7m_e02ea2b0-952d-405f-b158-c39fa002b78c/kuadrant-console-plugin/0.log" Apr 17 11:29:12.678673 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.678635 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2skc/must-gather-xtp6s"] Apr 17 11:29:12.679044 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.678939 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48c782c4-2e07-4563-8bde-b04fdf45fec1" containerName="authorino" Apr 17 11:29:12.679044 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.678949 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c782c4-2e07-4563-8bde-b04fdf45fec1" containerName="authorino" Apr 17 11:29:12.679044 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.679008 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="48c782c4-2e07-4563-8bde-b04fdf45fec1" containerName="authorino" Apr 17 11:29:12.681965 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.681950 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.683991 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.683976 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2skc\"/\"openshift-service-ca.crt\"" Apr 17 11:29:12.684432 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.684414 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d2skc\"/\"default-dockercfg-6vh64\"" Apr 17 11:29:12.684490 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.684435 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2skc\"/\"kube-root-ca.crt\"" Apr 17 11:29:12.690728 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.690693 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/must-gather-xtp6s"] Apr 17 11:29:12.759058 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.759017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4qbm\" (UniqueName: \"kubernetes.io/projected/63a92a0c-1988-4432-b0bb-4f33a7c911ba-kube-api-access-m4qbm\") pod \"must-gather-xtp6s\" (UID: \"63a92a0c-1988-4432-b0bb-4f33a7c911ba\") " pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.759224 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.759073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63a92a0c-1988-4432-b0bb-4f33a7c911ba-must-gather-output\") pod \"must-gather-xtp6s\" (UID: \"63a92a0c-1988-4432-b0bb-4f33a7c911ba\") " pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.860350 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.860314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63a92a0c-1988-4432-b0bb-4f33a7c911ba-must-gather-output\") pod \"must-gather-xtp6s\" (UID: \"63a92a0c-1988-4432-b0bb-4f33a7c911ba\") " pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.860537 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.860435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4qbm\" (UniqueName: \"kubernetes.io/projected/63a92a0c-1988-4432-b0bb-4f33a7c911ba-kube-api-access-m4qbm\") pod \"must-gather-xtp6s\" (UID: \"63a92a0c-1988-4432-b0bb-4f33a7c911ba\") " pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.860682 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.860661 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/63a92a0c-1988-4432-b0bb-4f33a7c911ba-must-gather-output\") pod \"must-gather-xtp6s\" (UID: \"63a92a0c-1988-4432-b0bb-4f33a7c911ba\") " pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.868009 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.867985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4qbm\" (UniqueName: \"kubernetes.io/projected/63a92a0c-1988-4432-b0bb-4f33a7c911ba-kube-api-access-m4qbm\") pod \"must-gather-xtp6s\" (UID: \"63a92a0c-1988-4432-b0bb-4f33a7c911ba\") " pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:12.992197 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:12.992096 2571 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-d2skc/must-gather-xtp6s" Apr 17 11:29:13.115275 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:13.115251 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/must-gather-xtp6s"] Apr 17 11:29:13.117943 ip-10-0-129-94 kubenswrapper[2571]: W0417 11:29:13.117911 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63a92a0c_1988_4432_b0bb_4f33a7c911ba.slice/crio-0ddc2cbfb03d7b2b1628e3dccffa634cdfec5ca12e0f0ca30c18e3b07df22b13 WatchSource:0}: Error finding container 0ddc2cbfb03d7b2b1628e3dccffa634cdfec5ca12e0f0ca30c18e3b07df22b13: Status 404 returned error can't find the container with id 0ddc2cbfb03d7b2b1628e3dccffa634cdfec5ca12e0f0ca30c18e3b07df22b13 Apr 17 11:29:13.119662 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:13.119643 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 11:29:13.576614 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:13.576572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/must-gather-xtp6s" event={"ID":"63a92a0c-1988-4432-b0bb-4f33a7c911ba","Type":"ContainerStarted","Data":"0ddc2cbfb03d7b2b1628e3dccffa634cdfec5ca12e0f0ca30c18e3b07df22b13"} Apr 17 11:29:14.585256 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:14.584419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/must-gather-xtp6s" event={"ID":"63a92a0c-1988-4432-b0bb-4f33a7c911ba","Type":"ContainerStarted","Data":"4f916284f8f2fe45862e82704665dab8519fdc05c15b2594e4c8f2909e13e53c"} Apr 17 11:29:14.585256 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:14.584466 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/must-gather-xtp6s" event={"ID":"63a92a0c-1988-4432-b0bb-4f33a7c911ba","Type":"ContainerStarted","Data":"0cae3fb11f56cc0ee44a076ad91b465a73570c0ca0e6c3d411486423321f57d2"} Apr 17 11:29:14.600459 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:14.600337 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2skc/must-gather-xtp6s" podStartSLOduration=1.806220741 podStartE2EDuration="2.6003166s" podCreationTimestamp="2026-04-17 11:29:12 +0000 UTC" firstStartedPulling="2026-04-17 11:29:13.119772755 +0000 UTC m=+782.515695838" lastFinishedPulling="2026-04-17 11:29:13.913868605 +0000 UTC m=+783.309791697" observedRunningTime="2026-04-17 11:29:14.600127887 +0000 UTC m=+783.996050992" watchObservedRunningTime="2026-04-17 11:29:14.6003166 +0000 UTC m=+783.996239705" Apr 17 11:29:15.469881 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:15.469848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-27glw_cf904787-1ca2-44e2-a227-75aa1d60f7a0/global-pull-secret-syncer/0.log" Apr 17 11:29:15.555904 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:15.555874 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-29v4q_a6c5567f-d00d-4e77-b239-f0ad9016d0b1/konnectivity-agent/0.log" Apr 17 11:29:15.648881 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:15.648854 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-94.ec2.internal_69d674d86c0903ec8afec4ccfab0ec85/haproxy/0.log" Apr 17 11:29:19.494933 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:19.494902 2571 log.go:25] "Finished parsing 
log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-g4tfv_9eab7d8c-c003-4085-be22-cd64e2cf3af0/authorino/0.log" Apr 17 11:29:19.596847 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:19.596811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ghw7m_e02ea2b0-952d-405f-b158-c39fa002b78c/kuadrant-console-plugin/0.log" Apr 17 11:29:20.993955 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:20.993899 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hp9n9_896800c7-73d6-4a93-95ad-0095f218b33d/monitoring-plugin/0.log" Apr 17 11:29:21.089482 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.089408 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98dxk_8c3b726a-5a6a-4448-9528-b468829506bc/node-exporter/0.log" Apr 17 11:29:21.118503 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.118468 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98dxk_8c3b726a-5a6a-4448-9528-b468829506bc/kube-rbac-proxy/0.log" Apr 17 11:29:21.143019 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.142916 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-98dxk_8c3b726a-5a6a-4448-9528-b468829506bc/init-textfile/0.log" Apr 17 11:29:21.651397 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.651330 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f7f856ccb-8jmgz_5466274d-9675-4358-af95-c9c0e9882159/thanos-query/0.log" Apr 17 11:29:21.674090 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.674054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f7f856ccb-8jmgz_5466274d-9675-4358-af95-c9c0e9882159/kube-rbac-proxy-web/0.log" Apr 17 11:29:21.698387 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.698333 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f7f856ccb-8jmgz_5466274d-9675-4358-af95-c9c0e9882159/kube-rbac-proxy/0.log" Apr 17 11:29:21.722333 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.722309 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f7f856ccb-8jmgz_5466274d-9675-4358-af95-c9c0e9882159/prom-label-proxy/0.log" Apr 17 11:29:21.750948 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.750906 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f7f856ccb-8jmgz_5466274d-9675-4358-af95-c9c0e9882159/kube-rbac-proxy-rules/0.log" Apr 17 11:29:21.780282 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:21.780167 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6f7f856ccb-8jmgz_5466274d-9675-4358-af95-c9c0e9882159/kube-rbac-proxy-metrics/0.log" Apr 17 11:29:24.104424 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.104398 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-fjq7v_419b6231-5c83-4da0-981d-a59db536295f/download-server/0.log" Apr 17 11:29:24.761287 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.761254 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d"] Apr 17 11:29:24.766863 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.766831 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.773121 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.773088 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d"] Apr 17 11:29:24.862330 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.862287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-proc\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.862538 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.862401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-sys\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.862538 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.862489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-lib-modules\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.862651 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.862588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggz29\" (UniqueName: \"kubernetes.io/projected/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-kube-api-access-ggz29\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.862702 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.862655 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-podres\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.964903 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.964862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggz29\" (UniqueName: \"kubernetes.io/projected/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-kube-api-access-ggz29\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965163 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-podres\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965276 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-proc\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965276 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-sys\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965276 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965265 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-proc\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965472 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-podres\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965472 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965329 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-sys\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965472 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-lib-modules\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.965576 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.965470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-lib-modules\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:24.974485 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:24.974458 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggz29\" (UniqueName: \"kubernetes.io/projected/1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad-kube-api-access-ggz29\") pod \"perf-node-gather-daemonset-f486d\" (UID: \"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad\") " pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:25.081177 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.081106 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:25.227603 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.227569 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d"] Apr 17 11:29:25.271582 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.271548 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6wgpx_d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e/dns/0.log" Apr 17 11:29:25.297492 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.297463 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6wgpx_d2c70e20-7fbd-4bbb-9ce6-10c8f99f985e/kube-rbac-proxy/0.log" Apr 17 11:29:25.458968 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.458944 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rctf4_05781f98-6d49-4771-a747-d678a55de76e/dns-node-resolver/0.log" Apr 17 11:29:25.631672 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.631580 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" event={"ID":"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad","Type":"ContainerStarted","Data":"2f3bfa5c4c85d5264ff0a5bf6fe85e53b798286be8485b802d539b112d9a658b"} Apr 17 11:29:25.631672 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.631629 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" event={"ID":"1a2a81c5-2912-4bfd-b5fb-727d2c38c0ad","Type":"ContainerStarted","Data":"ef1fcc1adf5d97e999282b5bec0f34e41c1cc14faadea197963cd5d0628357fe"} Apr 17 11:29:25.632534 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.632506 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:25.648747 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.648703 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" podStartSLOduration=1.648688701 podStartE2EDuration="1.648688701s" podCreationTimestamp="2026-04-17 11:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:29:25.647294674 +0000 UTC m=+795.043217778" watchObservedRunningTime="2026-04-17 11:29:25.648688701 +0000 UTC m=+795.044611804" Apr 17 11:29:25.940952 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:25.940884 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fgzfd_02fceaee-2358-4389-a551-6c489878daca/node-ca/0.log" Apr 17 11:29:27.286758 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:27.286727 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l6vms_b6d8ae60-a18c-4042-87a0-4790a47763c3/serve-healthcheck-canary/0.log" Apr 17 11:29:27.861644 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:27.861619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wl8lk_308d97f5-9121-4af5-a32a-1b143d7593d9/kube-rbac-proxy/0.log" Apr 17 11:29:27.882138 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:27.882118 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wl8lk_308d97f5-9121-4af5-a32a-1b143d7593d9/exporter/0.log" Apr 17 
11:29:27.903787 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:27.903768 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wl8lk_308d97f5-9121-4af5-a32a-1b143d7593d9/extractor/0.log" Apr 17 11:29:30.364599 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:30.364533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-56b679b657-dz99n_788421f1-977c-4d5f-9180-d51eec8be1ce/manager/0.log" Apr 17 11:29:30.422356 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:30.422331 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-656mq_5118fa57-5af0-4e64-9ae9-ac54ed661c3e/openshift-lws-operator/0.log" Apr 17 11:29:32.657249 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:32.657218 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d2skc/perf-node-gather-daemonset-f486d" Apr 17 11:29:37.101386 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.101344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-65tnl_7c6d0851-5688-40f9-8967-116e7a6bddf3/kube-multus/0.log" Apr 17 11:29:37.286676 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.286648 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/kube-multus-additional-cni-plugins/0.log" Apr 17 11:29:37.309247 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.309223 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/egress-router-binary-copy/0.log" Apr 17 11:29:37.333425 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.333365 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/cni-plugins/0.log" Apr 17 11:29:37.357655 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.357583 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/bond-cni-plugin/0.log" Apr 17 11:29:37.379907 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.379883 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/routeoverride-cni/0.log" Apr 17 11:29:37.402891 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.402867 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/whereabouts-cni-bincopy/0.log" Apr 17 11:29:37.425160 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.425138 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-v728t_31b7facb-4c12-4174-a583-430fbb53bf63/whereabouts-cni/0.log" Apr 17 11:29:37.715777 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.715693 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zsnbl_44159d9f-1705-4830-8bfe-c087640f29cb/network-metrics-daemon/0.log" Apr 17 11:29:37.739834 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:37.739809 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-zsnbl_44159d9f-1705-4830-8bfe-c087640f29cb/kube-rbac-proxy/0.log" Apr 17 11:29:39.182539 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.182511 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/ovn-controller/0.log" Apr 17 11:29:39.209699 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.209667 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/ovn-acl-logging/0.log" Apr 17 11:29:39.235173 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.235142 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/kube-rbac-proxy-node/0.log" Apr 17 11:29:39.257196 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.257164 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:29:39.277070 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.277046 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/northd/0.log" Apr 17 11:29:39.298949 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.298923 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/nbdb/0.log" Apr 17 11:29:39.321273 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.321243 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/sbdb/0.log" Apr 17 11:29:39.437695 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:39.437631 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wjrrc_b4022fde-6cb7-4448-ba75-34477921e084/ovnkube-controller/0.log" Apr 17 11:29:40.553297 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:40.553273 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7xnhl_27626c71-9dab-4636-93f8-f3321c44e711/network-check-target-container/0.log" Apr 17 11:29:41.578341 ip-10-0-129-94 kubenswrapper[2571]: I0417 11:29:41.578317 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jc7f5_4c820c8f-2002-4e3b-afd9-88115414ecc4/iptables-alerter/0.log"